Compare commits
2027 Commits
[Commit list: 9d117ee11b … 05b9af6195; the author, date, and message columns of the compare table were not preserved]
.build-config.json (new file, 9 lines)

```
@@ -0,0 +1,9 @@
{
  "qpdf": {
    "version": "10.6.3"
  },
  "jbig2enc": {
    "version": "0.29",
    "git_tag": "0.29"
  }
}
```
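The new `.build-config.json` pins the versions of third-party tools (qpdf, jbig2enc) in one place. As an illustration only — the build scripts that actually consume this file are not part of this diff, and the helper below is hypothetical — a build step could read the pinned versions like this:

```python
import json
from pathlib import Path


def load_pinned_versions(path: str = ".build-config.json") -> dict:
    """Return a tool -> version mapping read from .build-config.json.

    Hypothetical helper, shown only to illustrate how the pinned versions
    might be consumed; it is not part of the repository.
    """
    config = json.loads(Path(path).read_text())
    return {tool: spec["version"] for tool, spec in config.items()}


if __name__ == "__main__":
    # With the file above this prints {'qpdf': '10.6.3', 'jbig2enc': '0.29'}.
    print(load_pinned_versions())
```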
```
@@ -17,3 +17,5 @@
**/htmlcov
/src/.pytest_cache
.idea
.venv/
.vscode/
```
```
@@ -18,7 +18,16 @@ max_line_length = off
indent_size = 4
indent_style = space

[*.yml]
[*.{yml,yaml}]
indent_style = space

[*.rst]
indent_style = space

[*.md]
indent_style = space

[Pipfile.lock]
indent_style = space

# Tests don't get a line width restriction. It's still a good idea to follow
@@ -26,3 +35,6 @@ indent_style = space
# violate it.
[**/test_*.py]
max_line_length = off

[Dockerfile*]
indent_style = space
```
.env (1 line changed)

```
@@ -1 +1,2 @@
COMPOSE_PROJECT_NAME=paperless
export PROMPT="(pipenv-projectname)$P$G"
```
.github/ISSUE_TEMPLATE/bug-report.yml (vendored, new file, 88 lines)

```
@@ -0,0 +1,88 @@
name: Bug report
description: Something is not working
title: "[BUG] Concise description of the issue"
labels: ["bug", "unconfirmed"]
body:
  - type: markdown
    attributes:
      value: |
        Have a question? 👉 [Start a new discussion](https://github.com/paperless-ngx/paperless-ngx/discussions/new) or [ask in chat](https://matrix.to/#/#paperless:adnidor.de).

        Before opening an issue, please double check:

        - [The troubleshooting documentation](https://paperless-ngx.readthedocs.io/en/latest/troubleshooting.html).
        - [The installation instructions](https://paperless-ngx.readthedocs.io/en/latest/setup.html#installation).
        - [Existing issues and discussions](https://github.com/paperless-ngx/paperless-ngx/search?q=&type=issues).

        If you encounter issues while installing or configuring Paperless-ngx, please post in the ["Support" section of the discussions](https://github.com/paperless-ngx/paperless-ngx/discussions/new?category=support).
  - type: textarea
    id: description
    attributes:
      label: Description
      description: A clear and concise description of what the bug is. If applicable, add screenshots to help explain your problem.
      placeholder: |
        Currently Paperless does not work when...

        [Screenshot if applicable]
    validations:
      required: true
  - type: textarea
    id: reproduction
    attributes:
      label: Steps to reproduce
      description: Steps to reproduce the behavior.
      placeholder: |
        1. Go to '...'
        2. Click on '....'
        3. See error
    validations:
      required: true
  - type: textarea
    id: logs
    attributes:
      label: Webserver logs
      description: If available, post any logs from the web server related to your issue.
      render: bash
  - type: input
    id: version
    attributes:
      label: Paperless-ngx version
      placeholder: e.g. 1.6.0
    validations:
      required: true
  - type: input
    id: host-os
    attributes:
      label: Host OS
      description: Host OS of the machine running paperless-ngx. Please add the architecture (uname -m) if applicable.
      placeholder: e.g. Archlinux / Ubuntu 20.04 / Raspberry Pi `arm64`
    validations:
      required: true
  - type: dropdown
    id: install-method
    attributes:
      label: Installation method
      options:
        - Docker - official image
        - Docker - linuxserver.io image
        - Bare metal
        - Other (please describe above)
      description: Note there are significant differences from the official image and linuxserver.io, please check if your issue is specific to the third-party image.
    validations:
      required: true
  - type: input
    id: browser
    attributes:
      label: Browser
      description: Which browser you are using, if relevant.
      placeholder: e.g. Chrome, Safari
  - type: input
    id: config-changes
    attributes:
      label: Configuration changes
      description: Any configuration changes you made in `docker-compose.yml`, `docker-compose.env` or `paperless.conf`.
  - type: input
    id: other
    attributes:
      label: Other
      description: Any other relevant details.
```
.github/ISSUE_TEMPLATE/bug_report.md (vendored, deleted, 48 lines)

````
@@ -1,48 +0,0 @@
---
name: Bug report
about: Something is not working
title: "[BUG] Concise description of the issue"
labels: ''
assignees: ''

---

<!---
=> Before opening an issue, please check the documentation and see if it helps you resolve your issue: https://paperless-ng.readthedocs.io/en/latest/troubleshooting.html
=> Please also make sure that you followed the installation instructions.
=> Please search the issues and look for similar issues before opening a bug report.

=> If you would like to submit a feature request please submit one under https://github.com/jonaswinkler/paperless-ng/discussions/categories/feature-requests

=> If you encounter issues while installing of configuring Paperless-ng, please post that in the "Support" section of the discussions. Remember that Paperless successfully runs on a variety of different systems. If paperless does not start, it's probably an issue with your system, and not an issue of paperless.

=> Don't remove the [BUG] prefix from the title.
-->

**Describe the bug**
A clear and concise description of what the bug is.

**To Reproduce**
Steps to reproduce the behavior:
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error

**Expected behavior**
A clear and concise description of what you expected to happen.

**Screenshots**
If applicable, add screenshots to help explain your problem.

**Webserver logs**
```
If available, post any logs from the web server related to your issue.
```

**Relevant information**
- Host OS of the machine running paperless: [e.g. Archlinux / Ubuntu 20.04]
- Browser [e.g. chrome, safari]
- Version [e.g. 1.0.0]
- Installation method: [docker / bare metal]
- Any configuration changes you made in `docker-compose.yml`, `docker-compose.env` or `paperless.conf`.
````
.github/ISSUE_TEMPLATE/config.yml (vendored, new file, 11 lines)

```
@@ -0,0 +1,11 @@
blank_issues_enabled: false
contact_links:
  - name: 🤔 Questions and Help
    url: https://github.com/paperless-ngx/paperless-ngx/discussions
    about: This issue tracker is not for support questions. Please refer to our Discussions.
  - name: 💬 Chat
    url: https://matrix.to/#/#paperless:adnidor.de
    about: Want to discuss Paperless-ngx with others? Check out our chat.
  - name: 🚀 Feature Request
    url: https://github.com/paperless-ngx/paperless-ngx/discussions/new?category=feature-requests
    about: Remember to search for existing feature requests and "up-vote" any you like
```
.github/ISSUE_TEMPLATE/other.md (vendored, deleted, 20 lines)

```
@@ -1,20 +0,0 @@
---
name: Other
about: Anything that is not a feature request or bug.
title: "[Other] Title of your issue"
labels: ''
assignees: ''

---

<!--

=> Discussions, Feedback and other suggestions belong in the "Discussion" section and not on the issue tracker.

=> If you would like to submit a feature request please submit one under https://github.com/jonaswinkler/paperless-ng/discussions/categories/feature-requests

=> If you encounter issues while installing of configuring Paperless-ng, please post that in the "Support" section of the discussions. Remember that Paperless successfully runs on a variety of different systems. If paperless does not start, it's probably is an issue with your system, and not an issue of paperless.

=> Don't remove the [Other] prefix from the title.

-->
```
32
.github/PULL_REQUEST_TEMPLATE.md
vendored
Normal file
@@ -0,0 +1,32 @@
|
||||
<!--
|
||||
Note: All PRs with code changes should target the `dev` branch; pure documentation changes can target `main`
|
||||
-->
|
||||
|
||||
## Proposed change
|
||||
|
||||
<!--
|
||||
Please include a summary of the change, which issue it fixes (if any), and any relevant motivation / context. List any dependencies that are required for this change. If appropriate, please include an explanation of how your proposed change can be tested. Screenshots and / or videos can also be helpful if appropriate.
|
||||
-->
|
||||
|
||||
Fixes # (issue)
|
||||
|
||||
## Type of change
|
||||
|
||||
<!--
|
||||
What type of change does your PR introduce to Paperless-ngx?
|
||||
NOTE: Please check only one box!
|
||||
-->
|
||||
|
||||
- [ ] Bug fix (non-breaking change which fixes an issue)
|
||||
- [ ] New feature (non-breaking change which adds functionality)
|
||||
- [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected)
|
||||
- [ ] Other (please explain)
|
||||
|
||||
## Checklist:
|
||||
|
||||
- [ ] I have read & agree with the [contributing guidelines](https://github.com/paperless-ngx/paperless-ngx/blob/main/CONTRIBUTING.md).
|
||||
- [ ] If applicable, I have tested my code for new features & regressions on both mobile & desktop devices, using the latest version of major browsers.
|
||||
- [ ] If applicable, I have checked that all tests pass, see [documentation](https://paperless-ngx.readthedocs.io/en/latest/extending.html#back-end-development).
|
||||
- [ ] I have run all `pre-commit` hooks, see [documentation](https://paperless-ngx.readthedocs.io/en/latest/extending.html#code-formatting-with-pre-commit-hooks).
|
||||
- [ ] I have made corresponding changes to the documentation as needed.
|
||||
- [ ] I have checked my modifications for any breaking changes.
|
48
.github/dependabot.yml
vendored
Normal file
@@ -0,0 +1,48 @@
|
||||
# https://docs.github.com/en/code-security/supply-chain-security/keeping-your-dependencies-updated-automatically/configuration-options-for-dependency-updates#package-ecosystem
|
||||
|
||||
version: 2
|
||||
updates:
|
||||
|
||||
# Enable version updates for npm
|
||||
- package-ecosystem: "npm"
|
||||
target-branch: "dev"
|
||||
# Look for `package.json` and `lock` files in the `/src-ui` directory
|
||||
directory: "/src-ui"
|
||||
# Check the npm registry for updates every month
|
||||
schedule:
|
||||
interval: "monthly"
|
||||
labels:
|
||||
- "frontend"
|
||||
- "dependencies"
|
||||
# Add reviewers
|
||||
reviewers:
|
||||
- "paperless-ngx/frontend"
|
||||
|
||||
# Enable version updates for Python
|
||||
- package-ecosystem: "pip"
|
||||
target-branch: "dev"
|
||||
# Look for a `Pipfile` in the `root` directory
|
||||
directory: "/"
|
||||
# Check for updates once a week
|
||||
schedule:
|
||||
interval: "weekly"
|
||||
labels:
|
||||
- "backend"
|
||||
- "dependencies"
|
||||
# Add reviewers
|
||||
reviewers:
|
||||
- "paperless-ngx/backend"
|
||||
|
||||
# Enable updates for Github Actions
|
||||
- package-ecosystem: "github-actions"
|
||||
target-branch: "dev"
|
||||
directory: "/"
|
||||
schedule:
|
||||
# Check for updates to GitHub Actions every month
|
||||
interval: "monthly"
|
||||
labels:
|
||||
- "ci-cd"
|
||||
- "dependencies"
|
||||
# Add reviewers
|
||||
reviewers:
|
||||
- "paperless-ngx/ci-cd"
|
55
.github/release-drafter.yml
vendored
Normal file
@@ -0,0 +1,55 @@
|
||||
autolabeler:
|
||||
- label: "bug"
|
||||
branch:
|
||||
- '/^fix/'
|
||||
title:
|
||||
- "/^fix/i"
|
||||
- label: "enhancement"
|
||||
branch:
|
||||
- '/^feature/'
|
||||
title:
|
||||
- "/^feature/i"
|
||||
categories:
|
||||
- title: 'Breaking Changes'
|
||||
labels:
|
||||
- 'breaking-change'
|
||||
- title: 'Features'
|
||||
labels:
|
||||
- 'enhancement'
|
||||
- title: 'Bug Fixes'
|
||||
labels:
|
||||
- 'bug'
|
||||
- title: 'Documentation'
|
||||
label: 'documentation'
|
||||
- title: 'Maintenance'
|
||||
labels:
|
||||
- 'chore'
|
||||
- 'deployment'
|
||||
- 'translation'
|
||||
- 'ci-cd'
|
||||
- title: 'Dependencies'
|
||||
collapse-after: 3
|
||||
label: 'dependencies'
|
||||
- title: 'All App Changes'
|
||||
labels:
|
||||
- 'frontend'
|
||||
- 'backend'
|
||||
collapse-after: 0
|
||||
include-labels:
|
||||
- 'enhancement'
|
||||
- 'bug'
|
||||
- 'chore'
|
||||
- 'deployment'
|
||||
- 'translation'
|
||||
- 'dependencies'
|
||||
- 'documentation'
|
||||
- 'frontend'
|
||||
- 'backend'
|
||||
- 'ci-cd'
|
||||
category-template: '### $TITLE'
|
||||
change-template: '- $TITLE @$AUTHOR ([#$NUMBER]($URL))'
|
||||
change-title-escapes: '\<*_&#@'
|
||||
template: |
|
||||
## paperless-ngx $RESOLVED_VERSION
|
||||
|
||||
$CHANGES
|
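For context, a draft produced by this release-drafter configuration would render roughly like the example below; the version number, change titles, authors and PR numbers are purely illustrative and not taken from an actual release:

```
## paperless-ngx 1.7.1

### Bug Fixes

- Fix example crash on upload @some-user ([#1234](https://github.com/paperless-ngx/paperless-ngx/pull/1234))

### Dependencies

- Bump some-dependency from 1.0.0 to 1.0.1 @dependabot ([#1235](https://github.com/paperless-ngx/paperless-ngx/pull/1235))
```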
290
.github/scripts/cleanup-tags.py
vendored
Normal file
@@ -0,0 +1,290 @@
|
||||
#!/usr/bin/env python3
|
||||
import json
|
||||
import logging
|
||||
import os
|
||||
import shutil
|
||||
import subprocess
|
||||
from argparse import ArgumentParser
|
||||
from typing import Dict
|
||||
from typing import Final
|
||||
from typing import List
|
||||
|
||||
from common import get_log_level
|
||||
from github import ContainerPackage
|
||||
from github import GithubBranchApi
|
||||
from github import GithubContainerRegistryApi
|
||||
|
||||
logger = logging.getLogger("cleanup-tags")
|
||||
|
||||
|
||||
class DockerManifest2:
|
||||
"""
|
||||
Data class wrapping the Docker Image Manifest Version 2.
|
||||
|
||||
See https://docs.docker.com/registry/spec/manifest-v2-2/
|
||||
"""
|
||||
|
||||
def __init__(self, data: Dict) -> None:
|
||||
self._data = data
|
||||
# This is the sha256: digest string. Corresponds to Github API name
|
||||
# if the package is an untagged package
|
||||
self.digest = self._data["digest"]
|
||||
platform_data_os = self._data["platform"]["os"]
|
||||
platform_arch = self._data["platform"]["architecture"]
|
||||
platform_variant = self._data["platform"].get(
|
||||
"variant",
|
||||
"",
|
||||
)
|
||||
self.platform = f"{platform_data_os}/{platform_arch}{platform_variant}"
|
||||
|
||||
|
||||
def _main():
|
||||
parser = ArgumentParser(
|
||||
description="Using the GitHub API locate and optionally delete container"
|
||||
" tags which no longer have an associated feature branch",
|
||||
)
|
||||
|
||||
# Requires an affirmative command to actually do a delete
|
||||
parser.add_argument(
|
||||
"--delete",
|
||||
action="store_true",
|
||||
default=False,
|
||||
help="If provided, actually delete the container tags",
|
||||
)
|
||||
|
||||
# When a tagged image is updated, the previous version remains, but it is no longer tagged
|
||||
# Add this option to remove them as well
|
||||
parser.add_argument(
|
||||
"--untagged",
|
||||
action="store_true",
|
||||
default=False,
|
||||
help="If provided, delete untagged containers as well",
|
||||
)
|
||||
|
||||
# If given, the package is assumed to be a multi-arch manifest. Cache packages are
|
||||
# not multi-arch, all other types are
|
||||
parser.add_argument(
|
||||
"--is-manifest",
|
||||
action="store_true",
|
||||
default=False,
|
||||
help="If provided, the package is assumed to be a multi-arch manifest following schema v2",
|
||||
)
|
||||
|
||||
# Allows configuration of log level for debugging
|
||||
parser.add_argument(
|
||||
"--loglevel",
|
||||
default="info",
|
||||
help="Configures the logging level",
|
||||
)
|
||||
|
||||
# Get the name of the package being processed this round
|
||||
parser.add_argument(
|
||||
"package",
|
||||
help="The package to process",
|
||||
)
|
||||
|
||||
args = parser.parse_args()
|
||||
|
||||
logging.basicConfig(
|
||||
level=get_log_level(args),
|
||||
datefmt="%Y-%m-%d %H:%M:%S",
|
||||
format="%(asctime)s %(levelname)-8s %(message)s",
|
||||
)
|
||||
|
||||
# Must be provided in the environment
|
||||
repo_owner: Final[str] = os.environ["GITHUB_REPOSITORY_OWNER"]
|
||||
repo: Final[str] = os.environ["GITHUB_REPOSITORY"]
|
||||
gh_token: Final[str] = os.environ["TOKEN"]
|
||||
|
||||
# Find all branches named feature-*
|
||||
# Note: Only relevant to the main application, but simpler to
|
||||
# leave in for all packages
|
||||
with GithubBranchApi(gh_token) as branch_api:
|
||||
feature_branches = {}
|
||||
for branch in branch_api.get_branches(
|
||||
repo=repo,
|
||||
):
|
||||
if branch.name.startswith("feature-"):
|
||||
logger.debug(f"Found feature branch {branch.name}")
|
||||
feature_branches[branch.name] = branch
|
||||
|
||||
logger.info(f"Located {len(feature_branches)} feature branches")
|
||||
|
||||
with GithubContainerRegistryApi(gh_token, repo_owner) as container_api:
|
||||
# Get the information about all versions of the given package
|
||||
all_package_versions: List[
|
||||
ContainerPackage
|
||||
] = container_api.get_package_versions(args.package)
|
||||
|
||||
all_pkgs_tags_to_version: Dict[str, ContainerPackage] = {}
|
||||
for pkg in all_package_versions:
|
||||
for tag in pkg.tags:
|
||||
all_pkgs_tags_to_version[tag] = pkg
|
||||
logger.info(
|
||||
f"Located {len(all_package_versions)} versions of package {args.package}",
|
||||
)
|
||||
|
||||
# Filter to packages which are tagged with feature-*
|
||||
packages_tagged_feature: List[ContainerPackage] = []
|
||||
for package in all_package_versions:
|
||||
if package.tag_matches("feature-"):
|
||||
packages_tagged_feature.append(package)
|
||||
|
||||
feature_pkgs_tags_to_versions: Dict[str, ContainerPackage] = {}
|
||||
for pkg in packages_tagged_feature:
|
||||
for tag in pkg.tags:
|
||||
feature_pkgs_tags_to_versions[tag] = pkg
|
||||
|
||||
logger.info(
|
||||
f'Located {len(feature_pkgs_tags_to_versions)} versions of package {args.package} tagged "feature-"',
|
||||
)
|
||||
|
||||
# All the feature tags minus all the feature branches leaves us feature tags
|
||||
# with no corresponding branch
|
||||
tags_to_delete = list(
|
||||
set(feature_pkgs_tags_to_versions.keys()) - set(feature_branches.keys()),
|
||||
)
|
||||
|
||||
# All the tags minus the set of going to be deleted tags leaves us the
|
||||
# tags which will be kept around
|
||||
tags_to_keep = list(
|
||||
set(all_pkgs_tags_to_version.keys()) - set(tags_to_delete),
|
||||
)
|
||||
logger.info(
|
||||
f"Located {len(tags_to_delete)} versions of package {args.package} to delete",
|
||||
)
|
||||
|
||||
# Delete certain package versions for which no branch existed
|
||||
for tag_to_delete in tags_to_delete:
|
||||
package_version_info = feature_pkgs_tags_to_versions[tag_to_delete]
|
||||
|
||||
if args.delete:
|
||||
logger.info(
|
||||
f"Deleting {tag_to_delete} (id {package_version_info.id})",
|
||||
)
|
||||
container_api.delete_package_version(
|
||||
package_version_info,
|
||||
)
|
||||
|
||||
else:
|
||||
logger.info(
|
||||
f"Would delete {tag_to_delete} (id {package_version_info.id})",
|
||||
)
|
||||
|
||||
# Deal with untagged package versions
|
||||
if args.untagged:
|
||||
|
||||
logger.info("Handling untagged image packages")
|
||||
|
||||
if not args.is_manifest:
|
||||
# If the package is not a multi-arch manifest, images without tags are safe to delete.
|
||||
# They are not referred to by anything. This leaves only versions with at least 1 tag
|
||||
|
||||
for package in all_package_versions:
|
||||
if package.untagged:
|
||||
if args.delete:
|
||||
logger.info(
|
||||
f"Deleting id {package.id} named {package.name}",
|
||||
)
|
||||
container_api.delete_package_version(
|
||||
package,
|
||||
)
|
||||
else:
|
||||
logger.info(
|
||||
f"Would delete {package.name} (id {package.id})",
|
||||
)
|
||||
else:
|
||||
logger.info(
|
||||
f"Not deleting tag {package.tags[0]} of package {args.package}",
|
||||
)
|
||||
else:
|
||||
|
||||
"""
|
||||
Ok, bear with me, these are annoying.
|
||||
|
||||
Our images are multi-arch, so the manifest is more like a pointer to a sha256 digest.
|
||||
These images are untagged, but pointed to, and so should not be removed (or every pull fails).
|
||||
|
||||
So for each image getting kept, parse the manifest to find the digest(s) it points to. Then
|
||||
remove those from the list of untagged images. The final result is the set of untagged,
|
||||
not-pointed-to versions, which should be safe to remove.
|
||||
|
||||
Example:
|
||||
Tag: ghcr.io/paperless-ngx/paperless-ngx:1.7.1 refers to
|
||||
amd64: sha256:b9ed4f8753bbf5146547671052d7e91f68cdfc9ef049d06690b2bc866fec2690
|
||||
armv7: sha256:81605222df4ba4605a2ba4893276e5d08c511231ead1d5da061410e1bbec05c3
|
||||
arm64: sha256:374cd68db40734b844705bfc38faae84cc4182371de4bebd533a9a365d5e8f3b
|
||||
each of which appears as an untagged image, but isn't really.
|
||||
|
||||
So from the list of untagged packages, remove those digests. Once all tags which
|
||||
are being kept are checked, the remaining untagged packages are actually untagged
|
||||
with no manifest referring to them.
|
||||
|
||||
"""
|
||||
|
||||
# Simplify the untagged data, mapping name (which is a digest) to the version
|
||||
untagged_versions = {}
|
||||
for x in all_package_versions:
|
||||
if x.untagged:
|
||||
untagged_versions[x.name] = x
|
||||
|
||||
skips = 0
|
||||
# Extra security to not delete on an unexpected error
|
||||
actually_delete = True
|
||||
|
||||
# Parse manifests to locate digests pointed to
|
||||
for tag in sorted(tags_to_keep):
|
||||
full_name = f"ghcr.io/{repo_owner}/{args.package}:{tag}"
|
||||
logger.info(f"Checking manifest for {full_name}")
|
||||
try:
|
||||
proc = subprocess.run(
|
||||
[
|
||||
shutil.which("docker"),
|
||||
"manifest",
|
||||
"inspect",
|
||||
full_name,
|
||||
],
|
||||
capture_output=True,
|
||||
)
|
||||
|
||||
manifest_list = json.loads(proc.stdout)
|
||||
for manifest_data in manifest_list["manifests"]:
|
||||
manifest = DockerManifest2(manifest_data)
|
||||
|
||||
if manifest.digest in untagged_versions:
|
||||
logger.debug(
|
||||
f"Skipping deletion of {manifest.digest}, referred to by {full_name} for {manifest.platform}",
|
||||
)
|
||||
del untagged_versions[manifest.digest]
|
||||
skips += 1
|
||||
|
||||
except Exception as err:
|
||||
actually_delete = False
|
||||
logger.exception(err)
|
||||
|
||||
logger.info(
|
||||
f"Skipping deletion of {skips} packages referred to by a manifest",
|
||||
)
|
||||
|
||||
# Step 3.3 - Delete the untagged and not pointed at packages
|
||||
logger.info(f"Deleting untagged packages of {args.package}")
|
||||
for to_delete_name in untagged_versions:
|
||||
to_delete_version = untagged_versions[to_delete_name]
|
||||
|
||||
if args.delete and actually_delete:
|
||||
logger.info(
|
||||
f"Deleting id {to_delete_version.id} named {to_delete_version.name}",
|
||||
)
|
||||
container_api.delete_package_version(
|
||||
to_delete_version,
|
||||
)
|
||||
else:
|
||||
logger.info(
|
||||
f"Would delete {to_delete_name} (id {to_delete_version.id})",
|
||||
)
|
||||
else:
|
||||
logger.info("Leaving untagged images untouched")
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
_main()
|
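For reference, the `cleanup-tags.yml` workflow added later in this diff invokes this script once per package, along the lines of the following (copied from that workflow; the flags map directly to the arguments defined in `_main()` above):

```
python ${GITHUB_WORKSPACE}/.github/scripts/cleanup-tags.py --loglevel info --untagged --is-manifest --delete "paperless-ngx"
```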
43
.github/scripts/common.py
vendored
Normal file
@@ -0,0 +1,43 @@
|
||||
#!/usr/bin/env python3
|
||||
import logging
|
||||
|
||||
|
||||
def get_image_tag(
|
||||
repo_name: str,
|
||||
pkg_name: str,
|
||||
pkg_version: str,
|
||||
) -> str:
|
||||
"""
|
||||
Returns a string representing the normal image for a given package
|
||||
"""
|
||||
return f"ghcr.io/{repo_name.lower()}/builder/{pkg_name}:{pkg_version}"
|
||||
|
||||
|
||||
def get_cache_image_tag(
|
||||
repo_name: str,
|
||||
pkg_name: str,
|
||||
pkg_version: str,
|
||||
branch_name: str,
|
||||
) -> str:
|
||||
"""
|
||||
Returns a string representing the expected image cache tag for a given package
|
||||
|
||||
Registry type caching is utilized for the builder images, to allow fast
|
||||
rebuilds, generally almost instant for the same version
|
||||
"""
|
||||
return f"ghcr.io/{repo_name.lower()}/builder/cache/{pkg_name}:{pkg_version}"
|
||||
|
||||
|
||||
def get_log_level(args) -> int:
|
||||
levels = {
|
||||
"critical": logging.CRITICAL,
|
||||
"error": logging.ERROR,
|
||||
"warn": logging.WARNING,
|
||||
"warning": logging.WARNING,
|
||||
"info": logging.INFO,
|
||||
"debug": logging.DEBUG,
|
||||
}
|
||||
level = levels.get(args.loglevel.lower())
|
||||
if level is None:
|
||||
level = logging.INFO
|
||||
return level
|
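A minimal sketch of how these helpers compose image tags; the package name and version below are placeholders chosen for illustration, not values read from the repository configuration:

```
# Illustrative usage of the helpers defined above.
from common import get_cache_image_tag, get_image_tag

# Hypothetical package name and version, for demonstration only.
print(get_image_tag("Paperless-ngx/Paperless-ngx", "qpdf", "10.6.3"))
# -> ghcr.io/paperless-ngx/paperless-ngx/builder/qpdf:10.6.3

print(get_cache_image_tag("Paperless-ngx/Paperless-ngx", "qpdf", "10.6.3", "dev"))
# -> ghcr.io/paperless-ngx/paperless-ngx/builder/cache/qpdf:10.6.3
# Note: the branch name is accepted but is not currently part of the returned tag.
```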
92
.github/scripts/get-build-json.py
vendored
Executable file
@@ -0,0 +1,92 @@
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
This is a helper script for the multi-stage Docker image builder.
|
||||
It provides a single point of configuration for package version control.
|
||||
The output JSON object is used by the CI workflow to determine what versions
|
||||
to build and pull into the final Docker image.
|
||||
|
||||
Python package information is obtained from the Pipfile.lock. As this is
|
||||
kept updated by dependabot, it usually will need no further configuration.
|
||||
The sole exception currently is pikepdf, which has a dependency on qpdf,
|
||||
and is configured here to use the latest version of qpdf built by the workflow.
|
||||
|
||||
Other package version information is configured directly below, generally by
|
||||
setting the version and Git information, if any.
|
||||
|
||||
"""
|
||||
import argparse
|
||||
import json
|
||||
import os
|
||||
from pathlib import Path
|
||||
from typing import Final
|
||||
|
||||
from common import get_cache_image_tag
|
||||
from common import get_image_tag
|
||||
|
||||
|
||||
def _main():
|
||||
parser = argparse.ArgumentParser(
|
||||
description="Generate a JSON object of information required to build the given package, based on the Pipfile.lock",
|
||||
)
|
||||
parser.add_argument(
|
||||
"package",
|
||||
help="The name of the package to generate JSON for",
|
||||
)
|
||||
|
||||
PIPFILE_LOCK_PATH: Final[Path] = Path("Pipfile.lock")
|
||||
BUILD_CONFIG_PATH: Final[Path] = Path(".build-config.json")
|
||||
|
||||
# Read the main config file
|
||||
build_json: Final = json.loads(BUILD_CONFIG_PATH.read_text())
|
||||
|
||||
# Read Pipfile.lock file
|
||||
pipfile_data: Final = json.loads(PIPFILE_LOCK_PATH.read_text())
|
||||
|
||||
args: Final = parser.parse_args()
|
||||
|
||||
# Read from environment variables set by GitHub Actions
|
||||
repo_name: Final[str] = os.environ["GITHUB_REPOSITORY"]
|
||||
branch_name: Final[str] = os.environ["GITHUB_REF_NAME"]
|
||||
|
||||
# Default output values
|
||||
version = None
|
||||
extra_config = {}
|
||||
|
||||
if args.package in pipfile_data["default"]:
|
||||
# Read the version from Pipfile.lock
|
||||
pkg_data = pipfile_data["default"][args.package]
|
||||
pkg_version = pkg_data["version"].split("==")[-1]
|
||||
version = pkg_version
|
||||
|
||||
# Any extra/special values needed
|
||||
if args.package == "pikepdf":
|
||||
extra_config["qpdf_version"] = build_json["qpdf"]["version"]
|
||||
|
||||
elif args.package in build_json:
|
||||
version = build_json[args.package]["version"]
|
||||
|
||||
else:
|
||||
raise NotImplementedError(args.package)
|
||||
|
||||
# The JSON object we'll output
|
||||
output = {
|
||||
"name": args.package,
|
||||
"version": version,
|
||||
"image_tag": get_image_tag(repo_name, args.package, version),
|
||||
"cache_tag": get_cache_image_tag(
|
||||
repo_name,
|
||||
args.package,
|
||||
version,
|
||||
branch_name,
|
||||
),
|
||||
}
|
||||
|
||||
# Add anything special a package may need
|
||||
output.update(extra_config)
|
||||
|
||||
# Output the JSON info to stdout
|
||||
print(json.dumps(output))
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
_main()
|
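To illustrate the shape of the output, a run such as `python get-build-json.py qpdf` would print a single JSON object along these lines (the version shown is a placeholder; the real values come from `.build-config.json` or `Pipfile.lock`):

```
{
  "name": "qpdf",
  "version": "10.6.3",
  "image_tag": "ghcr.io/paperless-ngx/paperless-ngx/builder/qpdf:10.6.3",
  "cache_tag": "ghcr.io/paperless-ngx/paperless-ngx/builder/cache/qpdf:10.6.3"
}
```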
227
.github/scripts/github.py
vendored
Normal file
@@ -0,0 +1,227 @@
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
This module contains some useful classes for interacting with the Github API.
|
||||
The full documentation for the API can be found here: https://docs.github.com/en/rest
|
||||
|
||||
Mostly, this focuses on two areas, repo branches and repo packages, as the use case
|
||||
is cleaning up container images which are no longer referred to.
|
||||
|
||||
"""
|
||||
import functools
|
||||
import logging
|
||||
import re
|
||||
import urllib.parse
|
||||
from typing import Dict
|
||||
from typing import List
|
||||
from typing import Optional
|
||||
|
||||
import requests
|
||||
|
||||
logger = logging.getLogger("github-api")
|
||||
|
||||
|
||||
class _GithubApiBase:
|
||||
"""
|
||||
A base class for interacting with the Github API. It
|
||||
will handle the session and setting authorization headers.
|
||||
"""
|
||||
|
||||
def __init__(self, token: str) -> None:
|
||||
self._token = token
|
||||
self._session: Optional[requests.Session] = None
|
||||
|
||||
def __enter__(self) -> "_GithubApiBase":
|
||||
"""
|
||||
Sets up the required headers for auth and response
|
||||
type from the API
|
||||
"""
|
||||
self._session = requests.Session()
|
||||
self._session.headers.update(
|
||||
{
|
||||
"Accept": "application/vnd.github.v3+json",
|
||||
"Authorization": f"token {self._token}",
|
||||
},
|
||||
)
|
||||
return self
|
||||
|
||||
def __exit__(self, exc_type, exc_val, exc_tb):
|
||||
"""
|
||||
Ensures the authorization token is cleaned up no matter
|
||||
the reason for the exit
|
||||
"""
|
||||
if "Accept" in self._session.headers:
|
||||
del self._session.headers["Accept"]
|
||||
if "Authorization" in self._session.headers:
|
||||
del self._session.headers["Authorization"]
|
||||
|
||||
# Close the session as well
|
||||
self._session.close()
|
||||
self._session = None
|
||||
|
||||
def _read_all_pages(self, endpoint):
|
||||
"""
|
||||
Helper function to read all pages of an endpoint, utilizing the
|
||||
next.url until exhausted. Assumes the endpoint returns a list
|
||||
"""
|
||||
internal_data = []
|
||||
|
||||
while True:
|
||||
resp = self._session.get(endpoint)
|
||||
if resp.status_code == 200:
|
||||
internal_data += resp.json()
|
||||
if "next" in resp.links:
|
||||
endpoint = resp.links["next"]["url"]
|
||||
else:
|
||||
logger.debug("Exiting pagination loop")
|
||||
break
|
||||
else:
|
||||
logger.warning(f"Request to {endpoint} returned HTTP {resp.status_code}")
|
||||
break
|
||||
|
||||
return internal_data
|
||||
|
||||
|
||||
class _EndpointResponse:
|
||||
"""
|
||||
For all endpoint JSON responses, store the full
|
||||
response data, for ease of extending later, if need be.
|
||||
"""
|
||||
|
||||
def __init__(self, data: Dict) -> None:
|
||||
self._data = data
|
||||
|
||||
|
||||
class GithubBranch(_EndpointResponse):
|
||||
"""
|
||||
Simple wrapper for a repository branch, only extracts name information
|
||||
for now.
|
||||
"""
|
||||
|
||||
def __init__(self, data: Dict) -> None:
|
||||
super().__init__(data)
|
||||
self.name = self._data["name"]
|
||||
|
||||
|
||||
class GithubBranchApi(_GithubApiBase):
|
||||
"""
|
||||
Wrapper around branch API.
|
||||
|
||||
See https://docs.github.com/en/rest/branches/branches
|
||||
|
||||
"""
|
||||
|
||||
def __init__(self, token: str) -> None:
|
||||
super().__init__(token)
|
||||
|
||||
self._ENDPOINT = "https://api.github.com/repos/{REPO}/branches"
|
||||
|
||||
def get_branches(self, repo: str) -> List[GithubBranch]:
|
||||
"""
|
||||
Returns all current branches of the given repository owned by the given
|
||||
owner or organization.
|
||||
"""
|
||||
endpoint = self._ENDPOINT.format(REPO=repo)
|
||||
internal_data = self._read_all_pages(endpoint)
|
||||
return [GithubBranch(branch) for branch in internal_data]
|
||||
|
||||
|
||||
class ContainerPackage(_EndpointResponse):
|
||||
"""
|
||||
Data class wrapping the JSON response from the package related
|
||||
endpoints
|
||||
"""
|
||||
|
||||
def __init__(self, data: Dict):
|
||||
super().__init__(data)
|
||||
# This is a numerical ID, required for interactions with this
|
||||
# specific package, including deletion of it or restoration
|
||||
self.id: int = self._data["id"]
|
||||
|
||||
# A string name. This might be an actual name or it could be a
|
||||
# digest string like "sha256:"
|
||||
self.name: str = self._data["name"]
|
||||
|
||||
# URL to the package, including its ID, can be used for deletion
|
||||
# or restoration without needing to build up a URL ourselves
|
||||
self.url: str = self._data["url"]
|
||||
|
||||
# The list of tags applied to this image. May be an empty list
|
||||
self.tags: List[str] = self._data["metadata"]["container"]["tags"]
|
||||
|
||||
@functools.cached_property
|
||||
def untagged(self) -> bool:
|
||||
"""
|
||||
Returns True if the image has no tags applied to it, False otherwise
|
||||
"""
|
||||
return len(self.tags) == 0
|
||||
|
||||
@functools.cache
|
||||
def tag_matches(self, pattern: str) -> bool:
|
||||
"""
|
||||
Returns True if the image has at least one tag which matches the given regex,
|
||||
False otherwise
|
||||
"""
|
||||
for tag in self.tags:
|
||||
if re.match(pattern, tag) is not None:
|
||||
return True
|
||||
return False
|
||||
|
||||
def __repr__(self):
|
||||
return f"Package {self.name}"
|
||||
|
||||
|
||||
class GithubContainerRegistryApi(_GithubApiBase):
|
||||
"""
|
||||
Class wrapper to deal with the Github packages API. This class only deals with
|
||||
container type packages, the only type published by paperless-ngx.
|
||||
"""
|
||||
|
||||
def __init__(self, token: str, owner_or_org: str) -> None:
|
||||
super().__init__(token)
|
||||
self._owner_or_org = owner_or_org
|
||||
if self._owner_or_org == "paperless-ngx":
|
||||
# https://docs.github.com/en/rest/packages#get-all-package-versions-for-a-package-owned-by-an-organization
|
||||
self._PACKAGES_VERSIONS_ENDPOINT = "https://api.github.com/orgs/{ORG}/packages/{PACKAGE_TYPE}/{PACKAGE_NAME}/versions"
|
||||
# https://docs.github.com/en/rest/packages#delete-package-version-for-an-organization
|
||||
self._PACKAGE_VERSION_DELETE_ENDPOINT = "https://api.github.com/orgs/{ORG}/packages/{PACKAGE_TYPE}/{PACKAGE_NAME}/versions/{PACKAGE_VERSION_ID}"
|
||||
else:
|
||||
# https://docs.github.com/en/rest/packages#get-all-package-versions-for-a-package-owned-by-the-authenticated-user
|
||||
self._PACKAGES_VERSIONS_ENDPOINT = "https://api.github.com/user/packages/{PACKAGE_TYPE}/{PACKAGE_NAME}/versions"
|
||||
# https://docs.github.com/en/rest/packages#delete-a-package-version-for-the-authenticated-user
|
||||
self._PACKAGE_VERSION_DELETE_ENDPOINT = "https://api.github.com/user/packages/{PACKAGE_TYPE}/{PACKAGE_NAME}/versions/{PACKAGE_VERSION_ID}"
|
||||
|
||||
def get_package_versions(
|
||||
self,
|
||||
package_name: str,
|
||||
) -> List[ContainerPackage]:
|
||||
"""
|
||||
Returns all the versions of a given package (container images) from
|
||||
the API
|
||||
"""
|
||||
|
||||
package_type: str = "container"
|
||||
# Need to quote this for slashes in the name
|
||||
package_name = urllib.parse.quote(package_name, safe="")
|
||||
|
||||
endpoint = self._PACKAGES_VERSIONS_ENDPOINT.format(
|
||||
ORG=self._owner_or_org,
|
||||
PACKAGE_TYPE=package_type,
|
||||
PACKAGE_NAME=package_name,
|
||||
)
|
||||
|
||||
pkgs = []
|
||||
|
||||
for data in self._read_all_pages(endpoint):
|
||||
pkgs.append(ContainerPackage(data))
|
||||
|
||||
return pkgs
|
||||
|
||||
def delete_package_version(self, package_data: ContainerPackage):
|
||||
"""
|
||||
Deletes the given package version from the GHCR
|
||||
"""
|
||||
resp = self._session.delete(package_data.url)
|
||||
if resp.status_code != 204:
|
||||
logger.warning(
|
||||
f"Request to delete {package_data.url} returned HTTP {resp.status_code}",
|
||||
)
|
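Putting the pieces together, these wrappers are intended to be used as context managers, mirroring how `cleanup-tags.py` earlier in this diff consumes them; the token and repository values below are placeholders:

```
# Minimal usage sketch for the API wrappers defined above.
import os

from github import GithubBranchApi, GithubContainerRegistryApi

token = os.environ["TOKEN"]  # personal access token, as in the cleanup workflow

with GithubBranchApi(token) as branch_api:
    # List branches of the repository ("owner/name" form)
    branches = branch_api.get_branches(repo="paperless-ngx/paperless-ngx")

with GithubContainerRegistryApi(token, "paperless-ngx") as container_api:
    # All published versions of the "paperless-ngx" container package
    versions = container_api.get_package_versions("paperless-ngx")
    untagged = [v for v in versions if v.untagged]
```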
23
.github/stale.yml
vendored
Normal file
@@ -0,0 +1,23 @@
|
||||
# Number of days of inactivity before an issue becomes stale
|
||||
daysUntilStale: 30
|
||||
|
||||
# Number of days of inactivity before a stale issue is closed
|
||||
daysUntilClose: 7
|
||||
|
||||
# Only issues or pull requests with all of these labels are checked for staleness. Defaults to `[]` (disabled)
|
||||
onlyLabels: [cant-reproduce]
|
||||
|
||||
# Label to use when marking an issue as stale
|
||||
staleLabel: stale
|
||||
|
||||
# Comment to post when marking an issue as stale. Set to `false` to disable
|
||||
markComment: >
|
||||
This issue has been automatically marked as stale because it has not had
|
||||
recent activity. It will be closed if no further activity occurs. Thank you
|
||||
for your contributions.
|
||||
|
||||
# Comment to post when closing a stale issue. Set to `false` to disable
|
||||
closeComment: false
|
||||
|
||||
# See https://github.com/marketplace/stale for more info on the app
|
||||
# and https://github.com/probot/stale for the configuration docs
|
37
.github/workflow-scripts/check-trailing-newline
vendored
@@ -1,37 +0,0 @@
|
||||
#!/bin/bash
|
||||
|
||||
# Verify that all text files end in a trailing newline.
|
||||
|
||||
# Exit on first failing command.
|
||||
set -e
|
||||
|
||||
# Exit on unset variable.
|
||||
set -u
|
||||
|
||||
success=0
|
||||
|
||||
function is_plaintext_file() {
|
||||
local file="$1"
|
||||
if [[ $file == *.svg ]]; then
|
||||
echo ""
|
||||
return
|
||||
fi
|
||||
file --brief "${file}" | grep text
|
||||
}
|
||||
|
||||
# Split strings on newlines.
|
||||
IFS='
|
||||
'
|
||||
for file in $(git ls-files)
|
||||
do
|
||||
if [[ -z $(is_plaintext_file "${file}") ]]; then
|
||||
continue
|
||||
fi
|
||||
|
||||
if ! [[ -z "$(tail -c 1 "${file}")" ]]; then
|
||||
printf "File must end in a trailing newline: %s\n" "${file}" >&2
|
||||
success=255
|
||||
fi
|
||||
done
|
||||
|
||||
exit "${success}"
|
@@ -1,26 +0,0 @@
|
||||
#!/bin/bash
|
||||
|
||||
# Check for trailing whitespace at end of lines.
|
||||
|
||||
# Exit on first failing command.
|
||||
set -e
|
||||
# Exit on unset variable.
|
||||
set -u
|
||||
|
||||
FOUND_TRAILING_WHITESPACE=0
|
||||
|
||||
while read -r line; do
|
||||
if grep \
|
||||
"\s$" \
|
||||
--line-number \
|
||||
--with-filename \
|
||||
--binary-files=without-match \
|
||||
--exclude="*.svg" \
|
||||
--exclude="*.eps" \
|
||||
"${line}"; then
|
||||
echo "ERROR: Found trailing whitespace" >&2;
|
||||
FOUND_TRAILING_WHITESPACE=1
|
||||
fi
|
||||
done < <(git ls-files)
|
||||
|
||||
exit "${FOUND_TRAILING_WHITESPACE}"
|
78
.github/workflows/ansible.yml
vendored
@@ -1,78 +0,0 @@
|
||||
---
|
||||
name: Ansible Role
|
||||
|
||||
on:
|
||||
push:
|
||||
branches-ignore:
|
||||
- 'translations**'
|
||||
pull_request:
|
||||
branches-ignore:
|
||||
- 'translations**'
|
||||
|
||||
jobs:
|
||||
# https://molecule.readthedocs.io/en/latest/ci.html#github-actions
|
||||
test:
|
||||
runs-on: ubuntu-latest
|
||||
steps:
|
||||
- name: Check out the codebase
|
||||
uses: actions/checkout@v2
|
||||
if: github.event_name != 'pull_request'
|
||||
with:
|
||||
path: "${{ github.repository }}"
|
||||
- name: Check out the codebase
|
||||
uses: actions/checkout@v2
|
||||
if: github.event_name == 'pull_request'
|
||||
with:
|
||||
# merge commit is not available from tree at this point in time
|
||||
# https://github.com/actions/checkout#checkout-pull-request-head-commit-instead-of-merge-commit
|
||||
ref: "${{ github.event.pull_request.head.sha }}"
|
||||
path: "${{ github.repository }}"
|
||||
- name: Set up Python
|
||||
uses: actions/setup-python@v2
|
||||
- name: Set up Docker
|
||||
uses: docker-practice/actions-setup-docker@master
|
||||
- name: Install dependencies
|
||||
run: |
|
||||
python3 -m pip install --upgrade pip
|
||||
python3 -m pip install molecule[ansible,docker] jmespath
|
||||
ansible --version
|
||||
docker --version
|
||||
molecule --version
|
||||
python --version
|
||||
- name: Test installation/build/upgrade with molecule
|
||||
run: |
|
||||
cd ansible
|
||||
molecule create
|
||||
molecule verify
|
||||
molecule converge
|
||||
molecule idempotence
|
||||
molecule verify
|
||||
working-directory: "${{ github.repository }}"
|
||||
env:
|
||||
TARGET_GITHUB_SHA: "${{ github.event.pull_request.head.sha }}"
|
||||
# # https://galaxy.ansible.com/docs/contributing/importing.html
|
||||
# release:
|
||||
# runs-on: ubuntu-latest
|
||||
# needs:
|
||||
# - test
|
||||
# # https://docs.github.com/en/free-pro-team@latest/actions/reference/context-and-expression-syntax-for-github-actions#github-context
|
||||
# if: contains(github.ref, 'refs/tags/')
|
||||
# steps:
|
||||
# - name: Check out the codebase
|
||||
# uses: actions/checkout@v2
|
||||
# with:
|
||||
# path: "${{ github.repository }}"
|
||||
# - name: Set up Python
|
||||
# uses: actions/setup-python@v2
|
||||
# - name: Install dependencies
|
||||
# run: |
|
||||
# python3 -m pip install --upgrade ansible-base
|
||||
# ansible --version
|
||||
# python --version
|
||||
# - name: Trigger a new import on Galaxy
|
||||
# # TODO Check if source if pulled from cwd or imported from github
|
||||
# # https://github.com/ansible/ansible/blob/devel/lib/ansible/cli/galaxy.py
|
||||
# run: |
|
||||
# cd ansible
|
||||
# ansible-galaxy role import --api-key ${{ secrets.GALAXY_API_KEY }} $(echo ${{ github.repository }} | cut -d/ -f1) $(echo ${{ github.repository }} | cut -d/ -f2)
|
||||
# working-directory: "${{ github.repository }}"
|
607
.github/workflows/ci.yml
vendored
@@ -2,7 +2,11 @@ name: ci
|
||||
|
||||
on:
|
||||
push:
|
||||
tags: ng-*
|
||||
tags:
|
||||
# https://semver.org/#spec-item-2
|
||||
- 'v[0-9]+.[0-9]+.[0-9]+'
|
||||
# https://semver.org/#spec-item-9
|
||||
- 'v[0-9]+.[0-9]+.[0-9]+-beta.rc[0-9]+'
|
||||
branches-ignore:
|
||||
- 'translations**'
|
||||
pull_request:
|
||||
@@ -10,241 +14,428 @@ on:
|
||||
- 'translations**'
|
||||
|
||||
jobs:
|
||||
pre-commit:
|
||||
name: Linting Checks
|
||||
runs-on: ubuntu-latest
|
||||
steps:
|
||||
-
|
||||
name: Checkout repository
|
||||
uses: actions/checkout@v3
|
||||
|
||||
-
|
||||
name: Install tools
|
||||
uses: actions/setup-python@v4
|
||||
with:
|
||||
python-version: "3.9"
|
||||
|
||||
-
|
||||
name: Check files
|
||||
uses: pre-commit/action@v3.0.0
|
||||
|
||||
documentation:
|
||||
name: "Build Documentation"
|
||||
runs-on: ubuntu-20.04
|
||||
needs:
|
||||
- pre-commit
|
||||
steps:
|
||||
-
|
||||
name: Checkout
|
||||
uses: actions/checkout@v2
|
||||
uses: actions/checkout@v3
|
||||
-
|
||||
name: Install pipenv
|
||||
run: |
|
||||
pipx install pipenv==2022.8.5
|
||||
pipenv --version
|
||||
-
|
||||
name: Set up Python
|
||||
uses: actions/setup-python@v2
|
||||
uses: actions/setup-python@v4
|
||||
with:
|
||||
python-version: 3.9
|
||||
-
|
||||
name: Get pip cache dir
|
||||
id: pip-cache
|
||||
run: |
|
||||
echo "::set-output name=dir::$(pip cache dir)"
|
||||
-
|
||||
name: Persistent Github pip cache
|
||||
uses: actions/cache@v2
|
||||
with:
|
||||
path: ${{ steps.pip-cache.outputs.dir }}
|
||||
key: ${{ runner.os }}-pip3.8}
|
||||
cache: "pipenv"
|
||||
cache-dependency-path: 'Pipfile.lock'
|
||||
-
|
||||
name: Install dependencies
|
||||
run: |
|
||||
pip install --upgrade pipenv
|
||||
pipenv install --system --dev --ignore-pipfile
|
||||
pipenv sync --dev
|
||||
-
|
||||
name: List installed Python dependencies
|
||||
run: |
|
||||
pipenv run pip list
|
||||
-
|
||||
name: Make documentation
|
||||
run: |
|
||||
cd docs/
|
||||
make html
|
||||
pipenv run make html
|
||||
-
|
||||
name: Upload artifact
|
||||
uses: actions/upload-artifact@v2
|
||||
uses: actions/upload-artifact@v3
|
||||
with:
|
||||
name: documentation
|
||||
path: docs/_build/html/
|
||||
|
||||
codestyle:
|
||||
runs-on: ubuntu-20.04
|
||||
steps:
|
||||
-
|
||||
name: Checkout
|
||||
uses: actions/checkout@v2
|
||||
-
|
||||
name: Set up Python
|
||||
uses: actions/setup-python@v2
|
||||
with:
|
||||
python-version: 3.9
|
||||
-
|
||||
name: Get pip cache dir
|
||||
id: pip-cache
|
||||
run: |
|
||||
echo "::set-output name=dir::$(pip cache dir)"
|
||||
-
|
||||
name: Persistent Github pip cache
|
||||
uses: actions/cache@v2
|
||||
with:
|
||||
path: ${{ steps.pip-cache.outputs.dir }}
|
||||
key: ${{ runner.os }}-pip${{ matrix.python-version }}
|
||||
-
|
||||
name: Install dependencies
|
||||
run: |
|
||||
pip install --upgrade pipenv
|
||||
pipenv install --system --dev --ignore-pipfile
|
||||
-
|
||||
name: Codestyle
|
||||
run: |
|
||||
cd src/
|
||||
pycodestyle
|
||||
whitespace:
|
||||
runs-on: ubuntu-20.04
|
||||
steps:
|
||||
-
|
||||
name: Checkout
|
||||
uses: actions/checkout@v2
|
||||
-
|
||||
name: Ensure there are no trailing spaces
|
||||
run: |
|
||||
.github/workflow-scripts/check-trailing-whitespace
|
||||
-
|
||||
name: Ensure all text files end with a trailing newline
|
||||
run: |
|
||||
.github/workflow-scripts/check-trailing-whitespace
|
||||
|
||||
tests:
|
||||
tests-backend:
|
||||
name: "Tests (${{ matrix.python-version }})"
|
||||
runs-on: ubuntu-20.04
|
||||
needs:
|
||||
- pre-commit
|
||||
strategy:
|
||||
matrix:
|
||||
python-version: ['3.7', '3.8', '3.9']
|
||||
python-version: ['3.8', '3.9', '3.10']
|
||||
fail-fast: false
|
||||
steps:
|
||||
-
|
||||
name: Checkout
|
||||
uses: actions/checkout@v2
|
||||
uses: actions/checkout@v3
|
||||
with:
|
||||
fetch-depth: 2
|
||||
-
|
||||
name: Install pipenv
|
||||
run: |
|
||||
pipx install pipenv==2022.8.5
|
||||
pipenv --version
|
||||
-
|
||||
name: Set up Python
|
||||
uses: actions/setup-python@v2
|
||||
uses: actions/setup-python@v4
|
||||
with:
|
||||
python-version: "${{ matrix.python-version }}"
|
||||
cache: "pipenv"
|
||||
cache-dependency-path: 'Pipfile.lock'
|
||||
-
|
||||
name: Get pip cache dir
|
||||
id: pip-cache
|
||||
run: |
|
||||
echo "::set-output name=dir::$(pip cache dir)"
|
||||
-
|
||||
name: Persistent Github pip cache
|
||||
uses: actions/cache@v2
|
||||
with:
|
||||
path: ${{ steps.pip-cache.outputs.dir }}
|
||||
key: ${{ runner.os }}-pip${{ matrix.python-version }}
|
||||
-
|
||||
name: Install dependencies
|
||||
name: Install system dependencies
|
||||
run: |
|
||||
sudo apt-get update -qq
|
||||
sudo apt-get install -qq --no-install-recommends unpaper tesseract-ocr imagemagick ghostscript optipng
|
||||
pip install --upgrade pipenv
|
||||
pipenv install --system --dev --ignore-pipfile
|
||||
sudo apt-get install -qq --no-install-recommends unpaper tesseract-ocr imagemagick ghostscript libzbar0 poppler-utils
|
||||
-
|
||||
name: Install Python dependencies
|
||||
run: |
|
||||
pipenv sync --dev
|
||||
-
|
||||
name: List installed Python dependencies
|
||||
run: |
|
||||
pipenv run pip list
|
||||
-
|
||||
name: Tests
|
||||
run: |
|
||||
cd src/
|
||||
pytest
|
||||
pipenv run pytest
|
||||
-
|
||||
name: Get changed files
|
||||
id: changed-files-specific
|
||||
uses: tj-actions/changed-files@v29.0.2
|
||||
with:
|
||||
files: |
|
||||
src/**
|
||||
-
|
||||
name: List all changed files
|
||||
run: |
|
||||
for file in ${{ steps.changed-files-specific.outputs.all_changed_files }}; do
|
||||
echo "${file} was changed"
|
||||
done
|
||||
-
|
||||
name: Publish coverage results
|
||||
if: matrix.python-version == '3.9'
|
||||
if: matrix.python-version == '3.9' && steps.changed-files-specific.outputs.any_changed == 'true'
|
||||
env:
|
||||
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
|
||||
# https://github.com/coveralls-clients/coveralls-python/issues/251
|
||||
run: |
|
||||
cd src/
|
||||
coveralls --service=github
|
||||
pipenv run coveralls --service=github
|
||||
|
||||
frontend:
|
||||
tests-frontend:
|
||||
name: "Tests Frontend"
|
||||
runs-on: ubuntu-20.04
|
||||
needs:
|
||||
- pre-commit
|
||||
strategy:
|
||||
matrix:
|
||||
node-version: [16.x]
|
||||
steps:
|
||||
-
|
||||
name: Checkout
|
||||
uses: actions/checkout@v2
|
||||
-
|
||||
uses: actions/setup-node@v2
|
||||
with:
|
||||
node-version: '15'
|
||||
-
|
||||
name: Configure version on dev branches
|
||||
if: startsWith(github.ref, 'refs/heads/feature-') || github.ref == 'refs/heads/dev'
|
||||
run: |
|
||||
git_hash=$(git rev-parse --short "$GITHUB_SHA")
|
||||
git_branch=${GITHUB_REF#refs/heads/}
|
||||
sed -i -E "s/version: \"(.*)\"/version: \"${git_branch} ${git_hash}\"/g" src-ui/src/environments/environment.prod.ts
|
||||
-
|
||||
name: Build frontend
|
||||
run: ./compile-frontend.sh
|
||||
-
|
||||
name: Upload artifact
|
||||
uses: actions/upload-artifact@v2
|
||||
with:
|
||||
name: frontend-compiled
|
||||
path: src/documents/static/frontend/
|
||||
- uses: actions/checkout@v3
|
||||
-
|
||||
name: Use Node.js ${{ matrix.node-version }}
|
||||
uses: actions/setup-node@v3
|
||||
with:
|
||||
node-version: ${{ matrix.node-version }}
|
||||
- run: cd src-ui && npm ci
|
||||
- run: cd src-ui && npm run test
|
||||
- run: cd src-ui && npm run e2e:ci
|
||||
|
||||
prepare-docker-build:
|
||||
name: Prepare Docker Pipeline Data
|
||||
if: github.event_name == 'push' && (startsWith(github.ref, 'refs/heads/feature-') || github.ref == 'refs/heads/dev' || github.ref == 'refs/heads/beta' || contains(github.ref, 'beta.rc') || startsWith(github.ref, 'refs/tags/v'))
|
||||
runs-on: ubuntu-20.04
|
||||
# If the push triggered the installer library workflow, wait for it to
|
||||
# complete here. This ensures the required versions for the final
|
||||
# image have been built, while not waiting at all if the versions haven't changed
|
||||
concurrency:
|
||||
group: build-installer-library
|
||||
cancel-in-progress: false
|
||||
needs:
|
||||
- documentation
|
||||
- tests-backend
|
||||
- tests-frontend
|
||||
steps:
|
||||
-
|
||||
name: Set ghcr repository name
|
||||
id: set-ghcr-repository
|
||||
run: |
|
||||
ghcr_name=$(echo "${GITHUB_REPOSITORY}" | awk '{ print tolower($0) }')
|
||||
echo ::set-output name=repository::${ghcr_name}
|
||||
-
|
||||
name: Checkout
|
||||
uses: actions/checkout@v3
|
||||
-
|
||||
name: Set up Python
|
||||
uses: actions/setup-python@v4
|
||||
with:
|
||||
python-version: "3.9"
|
||||
-
|
||||
name: Setup qpdf image
|
||||
id: qpdf-setup
|
||||
run: |
|
||||
build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py qpdf)
|
||||
|
||||
echo ${build_json}
|
||||
|
||||
echo ::set-output name=qpdf-json::${build_json}
|
||||
-
|
||||
name: Setup psycopg2 image
|
||||
id: psycopg2-setup
|
||||
run: |
|
||||
build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py psycopg2)
|
||||
|
||||
echo ${build_json}
|
||||
|
||||
echo ::set-output name=psycopg2-json::${build_json}
|
||||
-
|
||||
name: Setup pikepdf image
|
||||
id: pikepdf-setup
|
||||
run: |
|
||||
build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py pikepdf)
|
||||
|
||||
echo ${build_json}
|
||||
|
||||
echo ::set-output name=pikepdf-json::${build_json}
|
||||
-
|
||||
name: Setup jbig2enc image
|
||||
id: jbig2enc-setup
|
||||
run: |
|
||||
build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py jbig2enc)
|
||||
|
||||
echo ${build_json}
|
||||
|
||||
echo ::set-output name=jbig2enc-json::${build_json}
|
||||
|
||||
outputs:
|
||||
|
||||
ghcr-repository: ${{ steps.set-ghcr-repository.outputs.repository }}
|
||||
|
||||
qpdf-json: ${{ steps.qpdf-setup.outputs.qpdf-json }}
|
||||
|
||||
pikepdf-json: ${{ steps.pikepdf-setup.outputs.pikepdf-json }}
|
||||
|
||||
psycopg2-json: ${{ steps.psycopg2-setup.outputs.psycopg2-json }}
|
||||
|
||||
jbig2enc-json: ${{ steps.jbig2enc-setup.outputs.jbig2enc-json}}
|
||||
|
||||
# build and push image to docker hub.
|
||||
build-docker-image:
|
||||
runs-on: ubuntu-20.04
|
||||
concurrency:
|
||||
group: ${{ github.workflow }}-build-docker-image-${{ github.ref_name }}
|
||||
cancel-in-progress: true
|
||||
needs:
|
||||
- prepare-docker-build
|
||||
steps:
|
||||
-
|
||||
name: Check pushing to Docker Hub
|
||||
id: docker-hub
|
||||
# Only push to Dockerhub from the main repo AND the ref is either:
|
||||
# main
|
||||
# dev
|
||||
# beta
|
||||
# a tag
|
||||
# Otherwise forks would require a Docker Hub account and secrets setup
|
||||
run: |
|
||||
if [[ ${{ needs.prepare-docker-build.outputs.ghcr-repository }} == "paperless-ngx/paperless-ngx" && ( ${{ github.ref_name }} == "main" || ${{ github.ref_name }} == "dev" || ${{ github.ref_name }} == "beta" || ${{ startsWith(github.ref, 'refs/tags/v') }} == "true" ) ]] ; then
|
||||
echo "Enabling DockerHub image push"
|
||||
echo ::set-output name=enable::"true"
|
||||
else
|
||||
echo "Not pushing to DockerHub"
|
||||
echo ::set-output name=enable::"false"
|
||||
fi
|
||||
-
|
||||
name: Gather Docker metadata
|
||||
id: docker-meta
|
||||
uses: docker/metadata-action@v4
|
||||
with:
|
||||
images: |
|
||||
ghcr.io/${{ needs.prepare-docker-build.outputs.ghcr-repository }}
|
||||
name=paperlessngx/paperless-ngx,enable=${{ steps.docker-hub.outputs.enable }}
|
||||
tags: |
|
||||
# Tag branches with branch name
|
||||
type=ref,event=branch
|
||||
# Process semver tags
|
||||
# For a tag x.y.z or vX.Y.Z, output an x.y.z and x.y image tag
|
||||
type=semver,pattern={{version}}
|
||||
type=semver,pattern={{major}}.{{minor}}
|
||||
-
|
||||
name: Checkout
|
||||
uses: actions/checkout@v3
|
||||
-
|
||||
name: Set up Docker Buildx
|
||||
uses: docker/setup-buildx-action@v2
|
||||
-
|
||||
name: Set up QEMU
|
||||
uses: docker/setup-qemu-action@v2
|
||||
-
|
||||
name: Login to Github Container Registry
|
||||
uses: docker/login-action@v2
|
||||
with:
|
||||
registry: ghcr.io
|
||||
username: ${{ github.actor }}
|
||||
password: ${{ secrets.GITHUB_TOKEN }}
|
||||
-
|
||||
name: Login to Docker Hub
|
||||
uses: docker/login-action@v2
|
||||
# Don't attempt to login if not pushing to Docker Hub
|
||||
if: steps.docker-hub.outputs.enable == 'true'
|
||||
with:
|
||||
username: ${{ secrets.DOCKERHUB_USERNAME }}
|
||||
password: ${{ secrets.DOCKERHUB_TOKEN }}
|
||||
-
|
||||
name: Build and push
|
||||
uses: docker/build-push-action@v3
|
||||
with:
|
||||
context: .
|
||||
file: ./Dockerfile
|
||||
platforms: linux/amd64,linux/arm/v7,linux/arm64
|
||||
push: ${{ github.event_name != 'pull_request' }}
|
||||
tags: ${{ steps.docker-meta.outputs.tags }}
|
||||
labels: ${{ steps.docker-meta.outputs.labels }}
|
||||
build-args: |
|
||||
JBIG2ENC_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.jbig2enc-json).version }}
|
||||
QPDF_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.qpdf-json).version }}
|
||||
PIKEPDF_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.pikepdf-json).version }}
|
||||
PSYCOPG2_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.psycopg2-json).version }}
|
||||
# Get cache layers from this branch, then dev, then main
|
||||
# This allows new branches to get at least some cache benefits, generally from dev
|
||||
cache-from: |
|
||||
type=registry,ref=ghcr.io/${{ needs.prepare-docker-build.outputs.ghcr-repository }}/builder/cache/app:${{ github.ref_name }}
|
||||
type=registry,ref=ghcr.io/${{ needs.prepare-docker-build.outputs.ghcr-repository }}/builder/cache/app:dev
|
||||
type=registry,ref=ghcr.io/${{ needs.prepare-docker-build.outputs.ghcr-repository }}/builder/cache/app:main
|
||||
cache-to: |
|
||||
type=registry,mode=max,ref=ghcr.io/${{ needs.prepare-docker-build.outputs.ghcr-repository }}/builder/cache/app:${{ github.ref_name }}
|
||||
-
|
||||
name: Inspect image
|
||||
run: |
|
||||
docker buildx imagetools inspect ${{ fromJSON(steps.docker-meta.outputs.json).tags[0] }}
|
||||
-
|
||||
name: Export frontend artifact from docker
|
||||
run: |
|
||||
docker create --name frontend-extract ${{ fromJSON(steps.docker-meta.outputs.json).tags[0] }}
|
||||
docker cp frontend-extract:/usr/src/paperless/src/documents/static/frontend src/documents/static/frontend/
|
||||
-
|
||||
name: Upload frontend artifact
|
||||
uses: actions/upload-artifact@v3
|
||||
with:
|
||||
name: frontend-compiled
|
||||
path: src/documents/static/frontend/
|
||||
|
||||
build-release:
|
||||
needs: [frontend, documentation, tests, whitespace, codestyle]
|
||||
needs:
|
||||
- build-docker-image
|
||||
runs-on: ubuntu-20.04
|
||||
steps:
|
||||
-
|
||||
name: Checkout
|
||||
uses: actions/checkout@v2
|
||||
uses: actions/checkout@v3
|
||||
-
|
||||
name: Install pipenv
|
||||
run: |
|
||||
pip3 install --upgrade pip setuptools wheel pipx
|
||||
pipx install pipenv
|
||||
-
|
||||
name: Set up Python
|
||||
uses: actions/setup-python@v2
|
||||
uses: actions/setup-python@v4
|
||||
with:
|
||||
python-version: 3.9
|
||||
cache: "pipenv"
|
||||
cache-dependency-path: 'Pipfile.lock'
|
||||
-
|
||||
name: Install dependencies
|
||||
name: Install Python dependencies
|
||||
run: |
|
||||
pipenv sync --dev
|
||||
-
|
||||
name: Install system dependencies
|
||||
run: |
|
||||
sudo apt-get update -qq
|
||||
sudo apt-get install -qq --no-install-recommends gettext liblept5
|
||||
pip3 install -r requirements.txt
|
||||
-
|
||||
name: Download frontend artifact
|
||||
uses: actions/download-artifact@v2
|
||||
uses: actions/download-artifact@v3
|
||||
with:
|
||||
name: frontend-compiled
|
||||
path: src/documents/static/frontend/
|
||||
-
|
||||
name: Download documentation artifact
|
||||
uses: actions/download-artifact@v2
|
||||
uses: actions/download-artifact@v3
|
||||
with:
|
||||
name: documentation
|
||||
path: docs/_build/html/
|
||||
-
|
||||
name: Move files
|
||||
name: Generate requirements file
|
||||
run: |
|
||||
mkdir dist
|
||||
mkdir dist/paperless-ng
|
||||
mkdir dist/paperless-ng/scripts
|
||||
cp .dockerignore .env Dockerfile Pipfile Pipfile.lock LICENSE README.md requirements.txt dist/paperless-ng/
|
||||
cp paperless.conf.example dist/paperless-ng/paperless.conf
|
||||
cp gunicorn.conf.py dist/paperless-ng/gunicorn.conf.py
|
||||
cp docker/ dist/paperless-ng/docker -r
|
||||
cp scripts/*.service scripts/*.sh dist/paperless-ng/scripts/
|
||||
cp src/ dist/paperless-ng/src -r
|
||||
cp docs/_build/html/ dist/paperless-ng/docs -r
|
||||
pipenv requirements > requirements.txt
|
||||
-
|
||||
name: Compile messages
|
||||
run: |
|
||||
cd dist/paperless-ng/src
|
||||
python3 manage.py compilemessages
|
||||
cd src/
|
||||
pipenv run python3 manage.py compilemessages
|
||||
-
|
||||
name: Collect static files
|
||||
run: |
|
||||
cd dist/paperless-ng/src
|
||||
python3 manage.py collectstatic --no-input
|
||||
cd src/
|
||||
pipenv run python3 manage.py collectstatic --no-input
|
||||
-
|
||||
name: Move files
|
||||
run: |
|
||||
mkdir dist
|
||||
mkdir dist/paperless-ngx
|
||||
mkdir dist/paperless-ngx/scripts
|
||||
cp .dockerignore .env Dockerfile Pipfile Pipfile.lock requirements.txt LICENSE README.md dist/paperless-ngx/
|
||||
cp paperless.conf.example dist/paperless-ngx/paperless.conf
|
||||
cp gunicorn.conf.py dist/paperless-ngx/gunicorn.conf.py
|
||||
cp -r docker/ dist/paperless-ngx/docker
|
||||
cp scripts/*.service scripts/*.sh dist/paperless-ngx/scripts/
|
||||
cp -r src/ dist/paperless-ngx/src
|
||||
cp -r docs/_build/html/ dist/paperless-ngx/docs
|
||||
mv static dist/paperless-ngx
|
||||
-
|
||||
name: Make release package
|
||||
run: |
|
||||
cd dist
|
||||
find . -name __pycache__ | xargs rm -r
|
||||
tar -cJf paperless-ng.tar.xz paperless-ng/
|
||||
tar -cJf paperless-ngx.tar.xz paperless-ngx/
|
||||
-
|
||||
name: Upload release artifact
|
||||
uses: actions/upload-artifact@v2
|
||||
uses: actions/upload-artifact@v3
|
||||
with:
|
||||
name: release
|
||||
path: dist/paperless-ng.tar.xz
|
||||
path: dist/paperless-ngx.tar.xz
|
||||
|
||||
publish-release:
|
||||
runs-on: ubuntu-latest
|
||||
needs: build-release
|
||||
if: contains(github.ref, 'refs/tags/ng-')
|
||||
runs-on: ubuntu-20.04
|
||||
outputs:
|
||||
prerelease: ${{ steps.get_version.outputs.prerelease }}
|
||||
changelog: ${{ steps.create-release.outputs.body }}
|
||||
version: ${{ steps.get_version.outputs.version }}
|
||||
needs:
|
||||
- build-release
|
||||
if: github.ref_type == 'tag' && (startsWith(github.ref_name, 'v') || contains(github.ref_name, '-beta.rc'))
|
||||
steps:
|
||||
-
|
||||
name: Download release artifact
|
||||
uses: actions/download-artifact@v2
|
||||
uses: actions/download-artifact@v3
|
||||
with:
|
||||
name: release
|
||||
path: ./
|
||||
@@ -252,20 +443,24 @@ jobs:
|
||||
name: Get version
|
||||
id: get_version
|
||||
run: |
|
||||
echo ::set-output name=version::${GITHUB_REF#refs/tags/ng-}
|
||||
echo ::set-output name=version::${{ github.ref_name }}
|
||||
if [[ ${{ contains(github.ref_name, '-beta.rc') }} == 'true' ]]; then
|
||||
echo ::set-output name=prerelease::true
|
||||
else
|
||||
echo ::set-output name=prerelease::false
|
||||
fi
|
||||
-
|
||||
name: Create release
|
||||
id: create_release
|
||||
uses: actions/create-release@v1
|
||||
name: Create Release and Changelog
|
||||
id: create-release
|
||||
uses: paperless-ngx/release-drafter@master
|
||||
with:
|
||||
name: Paperless-ngx ${{ steps.get_version.outputs.version }}
|
||||
tag: ${{ steps.get_version.outputs.version }}
|
||||
version: ${{ steps.get_version.outputs.version }}
|
||||
prerelease: ${{ steps.get_version.outputs.prerelease }}
|
||||
publish: true # ensures release is not marked as draft
|
||||
env:
|
||||
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
|
||||
with:
|
||||
tag_name: ng-${{ steps.get_version.outputs.version }}
|
||||
release_name: Paperless-ng ${{ steps.get_version.outputs.version }}
|
||||
draft: false
|
||||
prerelease: false
|
||||
body: |
|
||||
For a complete list of changes, see the changelog at https://paperless-ng.readthedocs.io/en/latest/changelog.html.
|
||||
-
|
||||
name: Upload release archive
|
||||
id: upload-release-asset
|
||||
@@ -273,74 +468,56 @@ jobs:
|
||||
env:
|
||||
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
|
||||
with:
|
||||
upload_url: ${{ steps.create_release.outputs.upload_url }} # This pulls from the CREATE RELEASE step above, referencing it's ID to get its outputs object, which include a `upload_url`. See this blog post for more info: https://jasonet.co/posts/new-features-of-github-actions/#passing-data-to-future-steps
|
||||
asset_path: ./paperless-ng.tar.xz
|
||||
asset_name: paperless-ng-${{ steps.get_version.outputs.version }}.tar.xz
|
||||
upload_url: ${{ steps.create-release.outputs.upload_url }}
|
||||
asset_path: ./paperless-ngx.tar.xz
|
||||
asset_name: paperless-ngx-${{ steps.get_version.outputs.version }}.tar.xz
|
||||
asset_content_type: application/x-xz
|
||||
|
||||
# build and push image to docker hub.
|
||||
build-docker-image:
|
||||
if: github.event_name == 'push' && (startsWith(github.ref, 'refs/heads/feature-') || github.ref == 'refs/heads/dev' || startsWith(github.ref, 'refs/tags/ng-'))
|
||||
runs-on: ubuntu-latest
|
||||
needs: [frontend, tests, whitespace, codestyle]
|
||||
append-changelog:
|
||||
runs-on: ubuntu-20.04
|
||||
needs:
|
||||
- publish-release
|
||||
if: needs.publish-release.outputs.prerelease == 'false'
|
||||
steps:
|
||||
-
|
||||
name: Prepare
|
||||
id: prepare
|
||||
run: |
|
||||
IMAGE_NAME=jonaswinkler/paperless-ng
|
||||
if [[ $GITHUB_REF == refs/tags/ng-* ]]; then
|
||||
TAGS=${IMAGE_NAME}:${GITHUB_REF#refs/tags/ng-},${IMAGE_NAME}:latest
|
||||
INSPECT_TAG=${IMAGE_NAME}:latest
|
||||
elif [[ $GITHUB_REF == refs/heads/* ]]; then
|
||||
TAGS=${IMAGE_NAME}:${GITHUB_REF#refs/heads/}
|
||||
INSPECT_TAG=${TAGS}
|
||||
else
|
||||
exit 1
|
||||
fi
|
||||
echo ::set-output name=tags::${TAGS}
|
||||
echo ::set-output name=inspect_tag::${INSPECT_TAG}
|
||||
-
|
||||
name: Checkout
|
||||
uses: actions/checkout@v2
|
||||
-
|
||||
name: Download frontend artifact
|
||||
uses: actions/download-artifact@v2
|
||||
uses: actions/checkout@v3
|
||||
with:
|
||||
name: frontend-compiled
|
||||
path: src/documents/static/frontend/
|
||||
ref: main
|
||||
-
|
||||
name: Set up Docker Buildx
|
||||
uses: docker/setup-buildx-action@v1
|
||||
-
|
||||
name: Set up QEMU
|
||||
uses: docker/setup-qemu-action@v1
|
||||
-
|
||||
name: Cache Docker layers
|
||||
uses: actions/cache@v2
|
||||
with:
|
||||
path: /tmp/.buildx-cache
|
||||
key: ${{ runner.os }}-buildx-${{ github.sha }}
|
||||
restore-keys: |
|
||||
${{ runner.os }}-buildx-
|
||||
-
|
||||
name: Login to DockerHub
|
||||
uses: docker/login-action@v1
|
||||
with:
|
||||
username: ${{ secrets.DOCKER_USERNAME }}
|
||||
password: ${{ secrets.DOCKER_PASSWORD }}
|
||||
-
|
||||
name: Build and push
|
||||
uses: docker/build-push-action@v2
|
||||
with:
|
||||
context: .
|
||||
file: ./Dockerfile
|
||||
platforms: linux/amd64,linux/arm/v7,linux/arm64
|
||||
push: true
|
||||
tags: ${{ steps.prepare.outputs.tags }}
|
||||
cache-from: type=local,src=/tmp/.buildx-cache
|
||||
cache-to: type=local,dest=/tmp/.buildx-cache
|
||||
-
|
||||
name: Inspect image
|
||||
name: Append Changelog to docs
|
||||
id: append-Changelog
|
||||
working-directory: docs
|
||||
run: |
|
||||
docker buildx imagetools inspect ${{ steps.prepare.outputs.inspect_tag }}
|
||||
git branch ${{ needs.publish-release.outputs.version }}-changelog
|
||||
git checkout ${{ needs.publish-release.outputs.version }}-changelog
|
||||
echo -e "# Changelog\n\n${{ needs.publish-release.outputs.changelog }}\n" > changelog-new.md
|
||||
echo "Manually linking usernames"
|
||||
sed -i -r 's|@(.+?) \(\[#|[@\1](https://github.com/\1) ([#|ig' changelog-new.md
|
||||
CURRENT_CHANGELOG=`tail --lines +2 changelog.md`
|
||||
echo -e "$CURRENT_CHANGELOG" >> changelog-new.md
|
||||
mv changelog-new.md changelog.md
|
||||
git config --global user.name "github-actions"
|
||||
git config --global user.email "41898282+github-actions[bot]@users.noreply.github.com"
|
||||
git commit -am "Changelog ${{ steps.get_version.outputs.version }} - GHA"
|
||||
git push origin ${{ needs.publish-release.outputs.version }}-changelog
|
||||
-
|
||||
name: Create Pull Request
|
||||
uses: actions/github-script@v6
|
||||
with:
|
||||
script: |
|
||||
const { repo, owner } = context.repo;
|
||||
const result = await github.rest.pulls.create({
|
||||
title: '[Documentation] Add ${{ needs.publish-release.outputs.version }} changelog',
|
||||
owner,
|
||||
repo,
|
||||
head: '${{ needs.publish-release.outputs.version }}-changelog',
|
||||
base: 'main',
|
||||
body: 'This PR is auto-generated by CI.'
|
||||
});
|
||||
github.rest.issues.addLabels({
|
||||
owner,
|
||||
repo,
|
||||
issue_number: result.data.number,
|
||||
labels: ['documentation']
|
||||
});
|
||||
|
108
.github/workflows/cleanup-tags.yml
vendored
Normal file
@@ -0,0 +1,108 @@
|
||||
# This workflow runs on certain conditions to check for and potentially
|
||||
# delete container images from the GHCR which no longer have an associated
|
||||
# code branch.
|
||||
# Requires a PAT with the correct scope set in the secrets
|
||||
|
||||
name: Cleanup Image Tags
|
||||
|
||||
on:
|
||||
schedule:
|
||||
- cron: '0 0 * * SAT'
|
||||
delete:
|
||||
pull_request:
|
||||
types:
|
||||
- closed
|
||||
push:
|
||||
paths:
|
||||
- ".github/workflows/cleanup-tags.yml"
|
||||
- ".github/scripts/cleanup-tags.py"
|
||||
- ".github/scripts/github.py"
|
||||
- ".github/scripts/common.py"
|
||||
|
||||
jobs:
|
||||
cleanup:
|
||||
name: Cleanup Image Tags
|
||||
runs-on: ubuntu-20.04
|
||||
env:
|
||||
# Requires a personal access token with the OAuth scope delete:packages
|
||||
TOKEN: ${{ secrets.GHA_CONTAINER_DELETE_TOKEN }}
|
||||
steps:
|
||||
-
|
||||
name: Checkout
|
||||
uses: actions/checkout@v3
|
||||
-
|
||||
name: Login to Github Container Registry
|
||||
uses: docker/login-action@v1
|
||||
with:
|
||||
registry: ghcr.io
|
||||
username: ${{ github.actor }}
|
||||
password: ${{ secrets.GITHUB_TOKEN }}
|
||||
-
|
||||
name: Set up Python
|
||||
uses: actions/setup-python@v3
|
||||
with:
|
||||
python-version: "3.10"
|
||||
-
|
||||
name: Install requests
|
||||
run: |
|
||||
python -m pip install requests
|
||||
# Clean up primary packages
|
||||
-
|
||||
name: Cleanup for package "paperless-ngx"
|
||||
if: "${{ env.TOKEN != '' }}"
|
||||
run: |
|
||||
python ${GITHUB_WORKSPACE}/.github/scripts/cleanup-tags.py --loglevel info --untagged --is-manifest --delete "paperless-ngx"
|
||||
-
|
||||
name: Cleanup for package "qpdf"
|
||||
if: "${{ env.TOKEN != '' }}"
|
||||
run: |
|
||||
python ${GITHUB_WORKSPACE}/.github/scripts/cleanup-tags.py --loglevel info --untagged --is-manifest --delete "paperless-ngx/builder/qpdf"
|
||||
-
|
||||
name: Cleanup for package "pikepdf"
|
||||
if: "${{ env.TOKEN != '' }}"
|
||||
run: |
|
||||
python ${GITHUB_WORKSPACE}/.github/scripts/cleanup-tags.py --loglevel info --untagged --is-manifest --delete "paperless-ngx/builder/pikepdf"
|
||||
-
|
||||
name: Cleanup for package "jbig2enc"
|
||||
if: "${{ env.TOKEN != '' }}"
|
||||
run: |
|
||||
python ${GITHUB_WORKSPACE}/.github/scripts/cleanup-tags.py --loglevel info --untagged --is-manifest --delete "paperless-ngx/builder/jbig2enc"
|
||||
-
|
||||
name: Cleanup for package "psycopg2"
|
||||
if: "${{ env.TOKEN != '' }}"
|
||||
run: |
|
||||
python ${GITHUB_WORKSPACE}/.github/scripts/cleanup-tags.py --loglevel info --untagged --is-manifest --delete "paperless-ngx/builder/psycopg2"
|
||||
#
|
||||
# Clean up registry cache packages
|
||||
#
|
||||
-
|
||||
name: Cleanup for package "builder/cache/app"
|
||||
if: "${{ env.TOKEN != '' }}"
|
||||
run: |
|
||||
python ${GITHUB_WORKSPACE}/.github/scripts/cleanup-tags.py --loglevel info --untagged --delete "paperless-ngx/builder/cache/app"
|
||||
-
|
||||
name: Cleanup for package "builder/cache/qpdf"
|
||||
if: "${{ env.TOKEN != '' }}"
|
||||
run: |
|
||||
python ${GITHUB_WORKSPACE}/.github/scripts/cleanup-tags.py --loglevel info --untagged --delete "paperless-ngx/builder/cache/qpdf"
|
||||
-
|
||||
name: Cleanup for package "builder/cache/psycopg2"
|
||||
if: "${{ env.TOKEN != '' }}"
|
||||
run: |
|
||||
python ${GITHUB_WORKSPACE}/.github/scripts/cleanup-tags.py --loglevel info --untagged --delete "paperless-ngx/builder/cache/psycopg2"
|
||||
-
|
||||
name: Cleanup for package "builder/cache/jbig2enc"
|
||||
if: "${{ env.TOKEN != '' }}"
|
||||
run: |
|
||||
python ${GITHUB_WORKSPACE}/.github/scripts/cleanup-tags.py --loglevel info --untagged --delete "paperless-ngx/builder/cache/jbig2enc"
|
||||
-
|
||||
name: Cleanup for package "builder/cache/pikepdf"
|
||||
if: "${{ env.TOKEN != '' }}"
|
||||
run: |
|
||||
python ${GITHUB_WORKSPACE}/.github/scripts/cleanup-tags.py --loglevel info --untagged --delete "paperless-ngx/builder/cache/pikepdf"
|
||||
-
|
||||
name: Check all tags still pull
|
||||
run: |
|
||||
ghcr_name=$(echo "${GITHUB_REPOSITORY}" | awk '{ print tolower($0) }')
|
||||
echo "Pulling all tags of ghcr.io/${ghcr_name}"
|
||||
docker pull --quiet --all-tags ghcr.io/${ghcr_name}
|
54
.github/workflows/codeql-analysis.yml
vendored
Normal file
@@ -0,0 +1,54 @@
|
||||
# For most projects, this workflow file will not need changing; you simply need
|
||||
# to commit it to your repository.
|
||||
#
|
||||
# You may wish to alter this file to override the set of languages analyzed,
|
||||
# or to provide custom queries or build logic.
|
||||
#
|
||||
# ******** NOTE ********
|
||||
# We have attempted to detect the languages in your repository. Please check
|
||||
# the `language` matrix defined below to confirm you have the correct set of
|
||||
# supported CodeQL languages.
|
||||
#
|
||||
name: "CodeQL"
|
||||
|
||||
on:
|
||||
push:
|
||||
branches: [ main, dev ]
|
||||
pull_request:
|
||||
# The branches below must be a subset of the branches above
|
||||
branches: [ dev ]
|
||||
schedule:
|
||||
- cron: '28 13 * * 5'
|
||||
|
||||
jobs:
|
||||
analyze:
|
||||
name: Analyze
|
||||
runs-on: ubuntu-latest
|
||||
permissions:
|
||||
actions: read
|
||||
contents: read
|
||||
security-events: write
|
||||
|
||||
strategy:
|
||||
fail-fast: false
|
||||
matrix:
|
||||
language: [ 'javascript', 'python' ]
|
||||
# CodeQL supports [ 'cpp', 'csharp', 'go', 'java', 'javascript', 'python', 'ruby' ]
|
||||
# Learn more about CodeQL language support at https://git.io/codeql-language-support
|
||||
|
||||
steps:
|
||||
- name: Checkout repository
|
||||
uses: actions/checkout@v2
|
||||
|
||||
# Initializes the CodeQL tools for scanning.
|
||||
- name: Initialize CodeQL
|
||||
uses: github/codeql-action/init@v2
|
||||
with:
|
||||
languages: ${{ matrix.language }}
|
||||
# If you wish to specify custom queries, you can do so here or in a config file.
|
||||
# By default, queries listed here will override any specified in a config file.
|
||||
# Prefix the list here with "+" to use these queries and those in the config file.
|
||||
# queries: ./path/to/local/query, your-org/your-repo/queries@main
|
||||
|
||||
- name: Perform CodeQL Analysis
|
||||
uses: github/codeql-action/analyze@v2
|
147
.github/workflows/installer-library.yml
vendored
Normal file
@@ -0,0 +1,147 @@
|
||||
# This workflow will run to update the installer library of
|
||||
# Docker images. These are the images which provide updated wheels
|
||||
# .deb installation packages or maybe just some compiled library
|
||||
|
||||
name: Build Image Library
|
||||
|
||||
on:
|
||||
push:
|
||||
# Must match one of these branches AND one of the paths
|
||||
# to be triggered
|
||||
branches:
|
||||
- "main"
|
||||
- "dev"
|
||||
- "library-*"
|
||||
- "feature-*"
|
||||
paths:
|
||||
# Trigger the workflow if a Dockerfile changed
|
||||
- "docker-builders/**"
|
||||
# Trigger if a package was updated
|
||||
- ".build-config.json"
|
||||
- "Pipfile.lock"
|
||||
# Also trigger on workflow changes related to the library
|
||||
- ".github/workflows/installer-library.yml"
|
||||
- ".github/workflows/reusable-workflow-builder.yml"
|
||||
- ".github/scripts/**"
|
||||
|
||||
# Set a workflow level concurrency group so primary workflow
|
||||
# can wait for this to complete if needed
|
||||
# DO NOT CHANGE without updating main workflow group
|
||||
concurrency:
|
||||
group: build-installer-library
|
||||
cancel-in-progress: false
|
||||
|
||||
jobs:
|
||||
prepare-docker-build:
|
||||
name: Prepare Docker Image Version Data
|
||||
runs-on: ubuntu-20.04
|
||||
steps:
|
||||
-
|
||||
name: Set ghcr repository name
|
||||
id: set-ghcr-repository
|
||||
run: |
|
||||
ghcr_name=$(echo "${GITHUB_REPOSITORY}" | awk '{ print tolower($0) }')
|
||||
echo ::set-output name=repository::${ghcr_name}
|
||||
-
|
||||
name: Checkout
|
||||
uses: actions/checkout@v3
|
||||
-
|
||||
name: Set up Python
|
||||
uses: actions/setup-python@v4
|
||||
with:
|
||||
python-version: "3.9"
|
||||
-
|
||||
name: Setup qpdf image
|
||||
id: qpdf-setup
|
||||
run: |
|
||||
build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py qpdf)
|
||||
|
||||
echo ${build_json}
|
||||
|
||||
echo ::set-output name=qpdf-json::${build_json}
|
||||
-
|
||||
name: Setup psycopg2 image
|
||||
id: psycopg2-setup
|
||||
run: |
|
||||
build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py psycopg2)
|
||||
|
||||
echo ${build_json}
|
||||
|
||||
echo ::set-output name=psycopg2-json::${build_json}
|
||||
-
|
||||
name: Setup pikepdf image
|
||||
id: pikepdf-setup
|
||||
run: |
|
||||
build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py pikepdf)
|
||||
|
||||
echo ${build_json}
|
||||
|
||||
echo ::set-output name=pikepdf-json::${build_json}
|
||||
-
|
||||
name: Setup jbig2enc image
|
||||
id: jbig2enc-setup
|
||||
run: |
|
||||
build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py jbig2enc)
|
||||
|
||||
echo ${build_json}
|
||||
|
||||
echo ::set-output name=jbig2enc-json::${build_json}
|
||||
|
||||
outputs:
|
||||
|
||||
ghcr-repository: ${{ steps.set-ghcr-repository.outputs.repository }}
|
||||
|
||||
qpdf-json: ${{ steps.qpdf-setup.outputs.qpdf-json }}
|
||||
|
||||
pikepdf-json: ${{ steps.pikepdf-setup.outputs.pikepdf-json }}
|
||||
|
||||
psycopg2-json: ${{ steps.psycopg2-setup.outputs.psycopg2-json }}
|
||||
|
||||
jbig2enc-json: ${{ steps.jbig2enc-setup.outputs.jbig2enc-json}}
|
||||
|
||||
build-qpdf-debs:
|
||||
name: qpdf
|
||||
needs:
|
||||
- prepare-docker-build
|
||||
uses: ./.github/workflows/reusable-workflow-builder.yml
|
||||
with:
|
||||
dockerfile: ./docker-builders/Dockerfile.qpdf
|
||||
build-json: ${{ needs.prepare-docker-build.outputs.qpdf-json }}
|
||||
build-args: |
|
||||
QPDF_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.qpdf-json).version }}
|
||||
|
||||
build-jbig2enc:
|
||||
name: jbig2enc
|
||||
needs:
|
||||
- prepare-docker-build
|
||||
uses: ./.github/workflows/reusable-workflow-builder.yml
|
||||
with:
|
||||
dockerfile: ./docker-builders/Dockerfile.jbig2enc
|
||||
build-json: ${{ needs.prepare-docker-build.outputs.jbig2enc-json }}
|
||||
build-args: |
|
||||
JBIG2ENC_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.jbig2enc-json).version }}
|
||||
|
||||
build-psycopg2-wheel:
|
||||
name: psycopg2
|
||||
needs:
|
||||
- prepare-docker-build
|
||||
uses: ./.github/workflows/reusable-workflow-builder.yml
|
||||
with:
|
||||
dockerfile: ./docker-builders/Dockerfile.psycopg2
|
||||
build-json: ${{ needs.prepare-docker-build.outputs.psycopg2-json }}
|
||||
build-args: |
|
||||
PSYCOPG2_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.psycopg2-json).version }}
|
||||
|
||||
build-pikepdf-wheel:
|
||||
name: pikepdf
|
||||
needs:
|
||||
- prepare-docker-build
|
||||
- build-qpdf-debs
|
||||
uses: ./.github/workflows/reusable-workflow-builder.yml
|
||||
with:
|
||||
dockerfile: ./docker-builders/Dockerfile.pikepdf
|
||||
build-json: ${{ needs.prepare-docker-build.outputs.pikepdf-json }}
|
||||
build-args: |
|
||||
REPO=${{ needs.prepare-docker-build.outputs.ghcr-repository }}
|
||||
QPDF_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.qpdf-json).version }}
|
||||
PIKEPDF_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.pikepdf-json).version }}
|
57
.github/workflows/project-actions.yml
vendored
Normal file
@@ -0,0 +1,57 @@
|
||||
name: Project Automations
|
||||
|
||||
on:
|
||||
issues:
|
||||
types:
|
||||
- opened
|
||||
- reopened
|
||||
pull_request_target: #_target allows access to secrets
|
||||
types:
|
||||
- opened
|
||||
- reopened
|
||||
branches:
|
||||
- main
|
||||
- dev
|
||||
|
||||
permissions:
|
||||
contents: read
|
||||
|
||||
env:
|
||||
todo: Todo
|
||||
done: Done
|
||||
in_progress: In Progress
|
||||
|
||||
jobs:
|
||||
issue_opened_or_reopened:
|
||||
name: issue_opened_or_reopened
|
||||
runs-on: ubuntu-latest
|
||||
if: github.event_name == 'issues' && (github.event.action == 'opened' || github.event.action == 'reopened')
|
||||
steps:
|
||||
- name: Add issue to project and set status to ${{ env.todo }}
|
||||
uses: leonsteinhaeuser/project-beta-automations@v1.3.0
|
||||
with:
|
||||
gh_token: ${{ secrets.GH_TOKEN }}
|
||||
organization: paperless-ngx
|
||||
project_id: 2
|
||||
resource_node_id: ${{ github.event.issue.node_id }}
|
||||
status_value: ${{ env.todo }} # Target status
|
||||
pr_opened_or_reopened:
|
||||
name: pr_opened_or_reopened
|
||||
runs-on: ubuntu-latest
|
||||
permissions:
|
||||
# write permission is required for autolabeler
|
||||
pull-requests: write
|
||||
if: github.event_name == 'pull_request_target' && (github.event.action == 'opened' || github.event.action == 'reopened') && github.event.pull_request.user.login != 'dependabot'
|
||||
steps:
|
||||
- name: Add PR to project and set status to "Needs Review"
|
||||
uses: leonsteinhaeuser/project-beta-automations@v1.3.0
|
||||
with:
|
||||
gh_token: ${{ secrets.GH_TOKEN }}
|
||||
organization: paperless-ngx
|
||||
project_id: 2
|
||||
resource_node_id: ${{ github.event.pull_request.node_id }}
|
||||
status_value: "Needs Review" # Target status
|
||||
- name: Label PR with release-drafter
|
||||
uses: release-drafter/release-drafter@v5
|
||||
env:
|
||||
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
|
53
.github/workflows/reusable-workflow-builder.yml
vendored
Normal file
@@ -0,0 +1,53 @@
|
||||
name: Reusable Image Builder
|
||||
|
||||
on:
|
||||
workflow_call:
|
||||
inputs:
|
||||
dockerfile:
|
||||
required: true
|
||||
type: string
|
||||
build-json:
|
||||
required: true
|
||||
type: string
|
||||
build-args:
|
||||
required: false
|
||||
default: ""
|
||||
type: string
|
||||
|
||||
concurrency:
|
||||
group: ${{ github.workflow }}-${{ fromJSON(inputs.build-json).name }}-${{ fromJSON(inputs.build-json).version }}
|
||||
cancel-in-progress: false
|
||||
|
||||
jobs:
|
||||
build-image:
|
||||
name: Build ${{ fromJSON(inputs.build-json).name }} @ ${{ fromJSON(inputs.build-json).version }}
|
||||
runs-on: ubuntu-latest
|
||||
steps:
|
||||
-
|
||||
name: Checkout
|
||||
uses: actions/checkout@v3
|
||||
-
|
||||
name: Login to Github Container Registry
|
||||
uses: docker/login-action@v2
|
||||
with:
|
||||
registry: ghcr.io
|
||||
username: ${{ github.actor }}
|
||||
password: ${{ secrets.GITHUB_TOKEN }}
|
||||
-
|
||||
name: Set up Docker Buildx
|
||||
uses: docker/setup-buildx-action@v2
|
||||
-
|
||||
name: Set up QEMU
|
||||
uses: docker/setup-qemu-action@v2
|
||||
-
|
||||
name: Build ${{ fromJSON(inputs.build-json).name }}
|
||||
uses: docker/build-push-action@v3
|
||||
with:
|
||||
context: .
|
||||
file: ${{ inputs.dockerfile }}
|
||||
tags: ${{ fromJSON(inputs.build-json).image_tag }}
|
||||
platforms: linux/amd64,linux/arm64,linux/arm/v7
|
||||
build-args: ${{ inputs.build-args }}
|
||||
push: true
|
||||
cache-from: type=registry,ref=${{ fromJSON(inputs.build-json).cache_tag }}
|
||||
cache-to: type=registry,mode=max,ref=${{ fromJSON(inputs.build-json).cache_tag }}
|
11
.gitignore
vendored
@@ -61,10 +61,16 @@ target/
|
||||
# PyCharm
|
||||
.idea
|
||||
|
||||
# VS Code
|
||||
.vscode
|
||||
/src-ui/.vscode
|
||||
/docs/.vscode
|
||||
|
||||
# Other stuff that doesn't belong
|
||||
.virtualenv
|
||||
virtualenv
|
||||
/venv
|
||||
.venv/
|
||||
/docker-compose.env
|
||||
/docker-compose.yml
|
||||
|
||||
@@ -81,8 +87,9 @@ scripts/nuke
|
||||
/paperless.conf
|
||||
/consume/
|
||||
/export/
|
||||
/src-ui/.vscode
|
||||
|
||||
# this is where the compiled frontend is moved to.
|
||||
/src/documents/static/frontend/
|
||||
/docs/.vscode/settings.json
|
||||
|
||||
# mac os
|
||||
.DS_Store
|
||||
|
8
.hadolint.yml
Normal file
@@ -0,0 +1,8 @@
|
||||
failure-threshold: warning
|
||||
ignored:
|
||||
# https://github.com/hadolint/hadolint/wiki/DL3008
|
||||
- DL3008
|
||||
# https://github.com/hadolint/hadolint/wiki/DL3013
|
||||
- DL3013
|
||||
# https://github.com/hadolint/hadolint/wiki/DL3003
|
||||
- DL3003
|
87
.pre-commit-config.yaml
Normal file
@@ -0,0 +1,87 @@
|
||||
# This file configures pre-commit hooks.
|
||||
# See https://pre-commit.com/ for general information
|
||||
# See https://pre-commit.com/hooks.html for a listing of possible hooks
|
||||
|
||||
repos:
|
||||
# General hooks
|
||||
- repo: https://github.com/pre-commit/pre-commit-hooks
|
||||
rev: v4.3.0
|
||||
hooks:
|
||||
- id: check-docstring-first
|
||||
- id: check-json
|
||||
exclude: "tsconfig.*json"
|
||||
- id: check-yaml
|
||||
- id: check-toml
|
||||
- id: check-executables-have-shebangs
|
||||
- id: end-of-file-fixer
|
||||
exclude_types:
|
||||
- svg
|
||||
- pofile
|
||||
exclude: "(^LICENSE$)"
|
||||
- id: mixed-line-ending
|
||||
args:
|
||||
- "--fix=lf"
|
||||
- id: trailing-whitespace
|
||||
exclude_types:
|
||||
- svg
|
||||
- id: check-case-conflict
|
||||
- id: detect-private-key
|
||||
- repo: https://github.com/pre-commit/mirrors-prettier
|
||||
rev: "v2.7.1"
|
||||
hooks:
|
||||
- id: prettier
|
||||
types_or:
|
||||
- javascript
|
||||
- ts
|
||||
- markdown
|
||||
exclude: "(^Pipfile\\.lock$)"
|
||||
# Python hooks
|
||||
- repo: https://github.com/asottile/reorder_python_imports
|
||||
rev: v3.8.2
|
||||
hooks:
|
||||
- id: reorder-python-imports
|
||||
exclude: "(migrations)"
|
||||
- repo: https://github.com/asottile/yesqa
|
||||
rev: "v1.4.0"
|
||||
hooks:
|
||||
- id: yesqa
|
||||
exclude: "(migrations)"
|
||||
- repo: https://github.com/asottile/add-trailing-comma
|
||||
rev: "v2.2.3"
|
||||
hooks:
|
||||
- id: add-trailing-comma
|
||||
exclude: "(migrations)"
|
||||
- repo: https://gitlab.com/pycqa/flake8
|
||||
rev: 3.9.2
|
||||
hooks:
|
||||
- id: flake8
|
||||
files: ^src/
|
||||
args:
|
||||
- "--config=./src/setup.cfg"
|
||||
- repo: https://github.com/psf/black
|
||||
rev: 22.6.0
|
||||
hooks:
|
||||
- id: black
|
||||
- repo: https://github.com/asottile/pyupgrade
|
||||
rev: v2.37.3
|
||||
hooks:
|
||||
- id: pyupgrade
|
||||
exclude: "(migrations)"
|
||||
args:
|
||||
- "--py38-plus"
|
||||
# Dockerfile hooks
|
||||
- repo: https://github.com/AleksaC/hadolint-py
|
||||
rev: v2.10.0
|
||||
hooks:
|
||||
- id: hadolint
|
||||
# Shell script hooks
|
||||
- repo: https://github.com/lovesegfault/beautysh
|
||||
rev: v6.2.1
|
||||
hooks:
|
||||
- id: beautysh
|
||||
args:
|
||||
- "--tab"
|
||||
- repo: https://github.com/shellcheck-py/shellcheck-py
|
||||
rev: "v0.8.0.4"
|
||||
hooks:
|
||||
- id: shellcheck
|
4
.prettierrc
Normal file
@@ -0,0 +1,4 @@
|
||||
# https://prettier.io/docs/en/options.html#semicolons
|
||||
semi: false
|
||||
# https://prettier.io/docs/en/options.html#quotes
|
||||
singleQuote: true
|
@@ -11,6 +11,6 @@ sphinx:
|
||||
|
||||
# Optionally set the version of Python and requirements required to build your docs
|
||||
python:
|
||||
version: 3.7
|
||||
version: "3.8"
|
||||
install:
|
||||
- requirements: docs/requirements.txt
|
||||
|
9
CODEOWNERS
Normal file
@@ -0,0 +1,9 @@
|
||||
/.github/workflows/ @paperless-ngx/ci-cd
|
||||
/docker/ @paperless-ngx/ci-cd
|
||||
/scripts/ @paperless-ngx/ci-cd
|
||||
|
||||
/src-ui/ @paperless-ngx/frontend
|
||||
|
||||
/src/ @paperless-ngx/backend
|
||||
Pipfile* @paperless-ngx/backend
|
||||
*.py @paperless-ngx/backend
|
128
CODE_OF_CONDUCT.md
Normal file
@@ -0,0 +1,128 @@
|
||||
# Contributor Covenant Code of Conduct
|
||||
|
||||
## Our Pledge
|
||||
|
||||
We as members, contributors, and leaders pledge to make participation in our
|
||||
community a harassment-free experience for everyone, regardless of age, body
|
||||
size, visible or invisible disability, ethnicity, sex characteristics, gender
|
||||
identity and expression, level of experience, education, socio-economic status,
|
||||
nationality, personal appearance, race, religion, or sexual identity
|
||||
and orientation.
|
||||
|
||||
We pledge to act and interact in ways that contribute to an open, welcoming,
|
||||
diverse, inclusive, and healthy community.
|
||||
|
||||
## Our Standards
|
||||
|
||||
Examples of behavior that contributes to a positive environment for our
|
||||
community include:
|
||||
|
||||
- Demonstrating empathy and kindness toward other people
|
||||
- Being respectful of differing opinions, viewpoints, and experiences
|
||||
- Giving and gracefully accepting constructive feedback
|
||||
- Accepting responsibility and apologizing to those affected by our mistakes,
|
||||
and learning from the experience
|
||||
- Focusing on what is best not just for us as individuals, but for the
|
||||
overall community
|
||||
|
||||
Examples of unacceptable behavior include:
|
||||
|
||||
- The use of sexualized language or imagery, and sexual attention or
|
||||
advances of any kind
|
||||
- Trolling, insulting or derogatory comments, and personal or political attacks
|
||||
- Public or private harassment
|
||||
- Publishing others' private information, such as a physical or email
|
||||
address, without their explicit permission
|
||||
- Other conduct which could reasonably be considered inappropriate in a
|
||||
professional setting
|
||||
|
||||
## Enforcement Responsibilities
|
||||
|
||||
Community leaders are responsible for clarifying and enforcing our standards of
|
||||
acceptable behavior and will take appropriate and fair corrective action in
|
||||
response to any behavior that they deem inappropriate, threatening, offensive,
|
||||
or harmful.
|
||||
|
||||
Community leaders have the right and responsibility to remove, edit, or reject
|
||||
comments, commits, code, wiki edits, issues, and other contributions that are
|
||||
not aligned to this Code of Conduct, and will communicate reasons for moderation
|
||||
decisions when appropriate.
|
||||
|
||||
## Scope
|
||||
|
||||
This Code of Conduct applies within all community spaces, and also applies when
|
||||
an individual is officially representing the community in public spaces.
|
||||
Examples of representing our community include using an official e-mail address,
|
||||
posting via an official social media account, or acting as an appointed
|
||||
representative at an online or offline event.
|
||||
|
||||
## Enforcement
|
||||
|
||||
Instances of abusive, harassing, or otherwise unacceptable behavior may be
|
||||
reported to the community leaders responsible for enforcement at
|
||||
hello@paperless-ngx.com.
|
||||
All complaints will be reviewed and investigated promptly and fairly.
|
||||
|
||||
All community leaders are obligated to respect the privacy and security of the
|
||||
reporter of any incident.
|
||||
|
||||
## Enforcement Guidelines
|
||||
|
||||
Community leaders will follow these Community Impact Guidelines in determining
|
||||
the consequences for any action they deem in violation of this Code of Conduct:
|
||||
|
||||
### 1. Correction
|
||||
|
||||
**Community Impact**: Use of inappropriate language or other behavior deemed
|
||||
unprofessional or unwelcome in the community.
|
||||
|
||||
**Consequence**: A private, written warning from community leaders, providing
|
||||
clarity around the nature of the violation and an explanation of why the
|
||||
behavior was inappropriate. A public apology may be requested.
|
||||
|
||||
### 2. Warning
|
||||
|
||||
**Community Impact**: A violation through a single incident or series
|
||||
of actions.
|
||||
|
||||
**Consequence**: A warning with consequences for continued behavior. No
|
||||
interaction with the people involved, including unsolicited interaction with
|
||||
those enforcing the Code of Conduct, for a specified period of time. This
|
||||
includes avoiding interactions in community spaces as well as external channels
|
||||
like social media. Violating these terms may lead to a temporary or
|
||||
permanent ban.
|
||||
|
||||
### 3. Temporary Ban
|
||||
|
||||
**Community Impact**: A serious violation of community standards, including
|
||||
sustained inappropriate behavior.
|
||||
|
||||
**Consequence**: A temporary ban from any sort of interaction or public
|
||||
communication with the community for a specified period of time. No public or
|
||||
private interaction with the people involved, including unsolicited interaction
|
||||
with those enforcing the Code of Conduct, is allowed during this period.
|
||||
Violating these terms may lead to a permanent ban.
|
||||
|
||||
### 4. Permanent Ban
|
||||
|
||||
**Community Impact**: Demonstrating a pattern of violation of community
|
||||
standards, including sustained inappropriate behavior, harassment of an
|
||||
individual, or aggression toward or disparagement of classes of individuals.
|
||||
|
||||
**Consequence**: A permanent ban from any sort of public interaction within
|
||||
the community.
|
||||
|
||||
## Attribution
|
||||
|
||||
This Code of Conduct is adapted from the [Contributor Covenant][homepage],
|
||||
version 2.0, available at
|
||||
https://www.contributor-covenant.org/version/2/0/code_of_conduct.html.
|
||||
|
||||
Community Impact Guidelines were inspired by [Mozilla's code of conduct
|
||||
enforcement ladder](https://github.com/mozilla/diversity).
|
||||
|
||||
[homepage]: https://www.contributor-covenant.org
|
||||
|
||||
For answers to common questions about this code of conduct, see the FAQ at
|
||||
https://www.contributor-covenant.org/faq. Translations are available at
|
||||
https://www.contributor-covenant.org/translations.
|
124
CONTRIBUTING.md
@@ -1,30 +1,132 @@
|
||||
# Contributing
|
||||
|
||||
There's still lots of things to be done, just have a look at that issue log. If you feel like conctributing to the project, please do! Bug fixes and improvements to the front end (I just can't seem to get some of these CSS things right) are always welcome.
|
||||
If you feel like contributing to the project, please do! Bug fixes and improvements are always welcome.
|
||||
|
||||
If you want to implement something big: Please start a discussion about that in the issues! Maybe I've already had something similar in mind and we can make it happen together. However, keep in mind that the general roadmap is to make the existing features stable and get them tested. See the roadmap in the readme.
|
||||
If you want to implement something big:
|
||||
|
||||
* When making additions to the project, consider if the majority of users will benefit from your change. If not, you're probably better of forking the project.
|
||||
* Also consider if your change will get in the way of other users. A good change is a change that enhances the experience of some users who want that change and does not affect users who do not care about the change.
|
||||
- Please start a discussion about that in the issues! Maybe something similar is already in development and we can make it happen together.
|
||||
- When making additions to the project, consider if the majority of users will benefit from your change. If not, you're probably better of forking the project.
|
||||
- Also consider if your change will get in the way of other users. A good change is a change that enhances the experience of some users who want that change and does not affect users who do not care about the change.
|
||||
- Please see the [paperless-ngx merge process](#merging-prs) below.
|
||||
|
||||
## Python
|
||||
|
||||
Paperless supports python 3.6, 3.7, 3.8 and 3.9.
|
||||
Paperless supports python 3.8 and 3.9. We format Python code with [Black](https://github.com/psf/black).
|
||||
|
||||
## Branches
|
||||
|
||||
master always reflects the latest release. Apart from changes to the documentation or readme, absolutely no functional changes on this branch in between releases.
|
||||
`main` always reflects the latest release. Apart from changes to the documentation or readme, absolutely no functional changes on this branch in between releases.
|
||||
|
||||
dev contains all changes that will be part of the next release. Use this branch to start making your changes.
|
||||
`dev` contains all changes that will be part of the next release. Use this branch to start making your changes.
|
||||
|
||||
feature-X branches is for experimental stuff that will eventually be merged into dev, and then released as part of the next release.
|
||||
`feature-X` branches are for experimental stuff that will eventually be merged into dev.
|
||||
|
||||
## Testing:
|
||||
|
||||
I'm trying to get most of paperless tested, so please do the same for your code! I know its a hassle, but it makes sure that your code works now and will allow us to detect regressions easily.
|
||||
Please format and test your code! I know it's a hassle, but it makes sure that your code works now and will allow us to detect regressions easily.
|
||||
|
||||
To test your code, execute `pytest` in the src/ directory. Executing that in the project root is no good. This also generates a html coverage report, which you can use to see if you missed anything important during testing.
|
||||
To test your code, execute `pytest` in the src/ directory. This also generates a html coverage report, which you can use to see if you missed anything important during testing.
|
||||
|
||||
Before you can run `pytest`, ensure to [properly set up your local environment](https://paperless-ngx.readthedocs.io/en/latest/extending.html#initial-setup-and-first-start).
|
||||
|
||||
## More info:
|
||||
|
||||
... is available in the documentation. https://paperless-ng.readthedocs.io/en/latest/extending.html
|
||||
... is available in the documentation. https://paperless-ngx.readthedocs.io/en/latest/extending.html
|
||||
|
||||
# Merging PRs
|
||||
|
||||
Once you have submitted a **P**ull **R**equest it will be reviewed, approved, and merged by one or more community members of any team. Automated code tests and formatting checks must be passed.
|
||||
|
||||
## Non-Trivial Requests
|
||||
|
||||
PRs deemed `non-trivial` will go through a stricter review process before being merged into `dev`. This is to ensure code quality and complete functionality (free of side effects).
|
||||
|
||||
Examples of `non-trivial` PRs might include:
|
||||
|
||||
- Additional features
|
||||
- Large changes to many distinct files
|
||||
- Breaking or depreciation of existing features
|
||||
|
||||
Our community review process for `non-trivial` PRs is the following:
|
||||
|
||||
1. Must pass usual automated code tests and formatting checks.
|
||||
2. The PR will be assigned and pinged to the appropriately experienced team (i.e. @paperless-ngx/backend for backend changes).
|
||||
3. Development team will check and test code manually (possibly over several days).
|
||||
- You may be asked to make changes or rebase.
|
||||
- The team may ask for additional testing done by @paperless-ngx/test
|
||||
4. **At least two** members of the team will approve and finally merge the request into `dev` 🎉.
|
||||
|
||||
This process might be slow as community members have different schedules and time to dedicate to the Paperless project. However it ensures community code reviews are as brilliantly thorough as they once were with @jonaswinkler.
|
||||
|
||||
# Translating Paperless-ngx
|
||||
|
||||
Some notes about translation:
|
||||
|
||||
- There are two resources:
|
||||
- `src-ui/messages.xlf` contains the translation strings for the front end. This is the most important.
|
||||
- `django.po` contains strings for the administration section of paperless, which is nice to have translated.
|
||||
- Most of the front-end strings are used on buttons, menu items, etc., so ideally the translated string should not be much longer than the English original.
|
||||
- Translation units may contain placeholders. These usually mean that there's a name of a tag or document or something in the string. You can click on the placeholders to copy them.
|
||||
- Translation units may contain plural expressions such as `{PLURAL_VAR, plural, =1 {one result} =0 {no results} other {<placeholder> results}}`. Copy these verbatim and translate only the content in the inner `{}` brackets. Example: `{PLURAL_VAR, plural, =1 {Ein Ergebnis} =0 {Keine Ergebnisse} other {<placeholder> Ergebnisse}}`
|
||||
- Changes to translations on Crowdin will get pushed into the repository automatically.
|
||||
|
||||
## Adding new languages to the codebase
|
||||
|
||||
If a language has already been added, and you would like to contribute new translations or change existing translations, please read the "Translation" section in the README.md file for further details on that.
|
||||
|
||||
If you would like the project to be translated to another language, first head over to https://crwd.in/paperless-ngx to check if that language has already been enabled for translation.
|
||||
If not, please request the language to be added by creating an issue on GitHub. The issue should contain:
|
||||
|
||||
- English name of the language (the localized name can be added on Crowdin).
|
||||
- ISO language code. A list of those can be found here: https://support.crowdin.com/enterprise/language-codes/
|
||||
- Date format commonly used for the language, e.g. dd/mm/yyyy, mm/dd/yyyy, etc.
|
||||
|
||||
After the language has been added and some translations have been made on Crowdin, the language needs to be enabled in the code.
|
||||
Note that there is no need to manually add a .po of .xlf file as those will be automatically generated and imported from Crowdin.
|
||||
The following files need to be changed:
|
||||
|
||||
- src-ui/angular.json (under the _projects/paperless-ui/i18n/locales_ JSON key)
|
||||
- src/paperless/settings.py (in the _LANGUAGES_ array)
|
||||
- src-ui/src/app/services/settings.service.ts (inside the _getLanguageOptions_ method)
|
||||
- src-ui/src/app/app.module.ts (import locale from _angular/common/locales_ and call _registerLocaleData_)
|
||||
|
||||
Please add the language in the correct order, alphabetically by locale.
|
||||
Note that _en-us_ needs to stay on top of the list, as it is the default project language
|
||||
|
||||
If you are familiar with Git, feel free to send a Pull Request with those changes.
|
||||
If not, let us know in the issue you created for the language, so that another developer can make these changes.
|
||||
|
||||
# Organization Structure & Membership
|
||||
|
||||
Paperless-ngx is a community project. We do our best to delegate permission and responsibility among a team of people to ensure the longevity of the project.
|
||||
|
||||
## Structure
|
||||
|
||||
As of writing, there are 21 members in paperless-ngx. 4 of these people have complete administrative privileges to the repo:
|
||||
|
||||
- [@shamoon](https://github.com/shamoon)
|
||||
- [@bauerj](https://github.com/bauerj)
|
||||
- [@qcasey](https://github.com/qcasey)
|
||||
- [@FrankStrieter](https://github.com/FrankStrieter)
|
||||
|
||||
There are 5 teams collaborating on specific tasks within paperless-ngx:
|
||||
|
||||
- @paperless-ngx/backend (Python / django)
|
||||
- @paperless-ngx/frontend (JavaScript / Typescript)
|
||||
- @paperless-ngx/ci-cd (GitHub Actions / Deployment)
|
||||
- @paperless-ngx/issues (Issue triage)
|
||||
- @paperless-ngx/test (General testing for larger PRs)
|
||||
|
||||
## Permissions
|
||||
|
||||
All team members are notified when mentioned or assigned to a relevant issue or pull request. Additionally, each team has slightly different access to paperless-ngx:
|
||||
|
||||
- The **test** team has no special permissions.
|
||||
- The **issues** team has `triage` access. This means they can organize issues and pull requests.
|
||||
- The **backend**, **frontend**, and **ci-cd** teams have `write` access. This means they can approve PRs and push code, containers, releases, and more.
|
||||
|
||||
## Joining
|
||||
|
||||
We are not overly strict with inviting people to the organization. If you have read the [team permissions](#permissions) and think having additional access would enhance your contributions, please reach out to an [admin](#structure) of the team.
|
||||
|
||||
The admins occasionally invite contributors directly if we believe having them on a team will accelerate their work.
|
||||
|
309
Dockerfile
@@ -1,104 +1,245 @@
|
||||
FROM ubuntu:20.04 AS jbig2enc
|
||||
# syntax=docker/dockerfile:1.4
|
||||
|
||||
WORKDIR /usr/src/jbig2enc
|
||||
# Pull the installer images from the library
|
||||
# These are all built previously
|
||||
# They provide either a .deb or .whl
|
||||
|
||||
RUN apt-get update && apt-get install -y --no-install-recommends build-essential automake libtool libleptonica-dev zlib1g-dev git ca-certificates
|
||||
ARG JBIG2ENC_VERSION
|
||||
ARG QPDF_VERSION
|
||||
ARG PIKEPDF_VERSION
|
||||
ARG PSYCOPG2_VERSION
|
||||
|
||||
RUN git clone https://github.com/agl/jbig2enc .
|
||||
RUN ./autogen.sh
|
||||
RUN ./configure && make
|
||||
FROM ghcr.io/paperless-ngx/paperless-ngx/builder/jbig2enc:${JBIG2ENC_VERSION} as jbig2enc-builder
|
||||
FROM ghcr.io/paperless-ngx/paperless-ngx/builder/qpdf:${QPDF_VERSION} as qpdf-builder
|
||||
FROM ghcr.io/paperless-ngx/paperless-ngx/builder/pikepdf:${PIKEPDF_VERSION} as pikepdf-builder
|
||||
FROM ghcr.io/paperless-ngx/paperless-ngx/builder/psycopg2:${PSYCOPG2_VERSION} as psycopg2-builder
|
||||
|
||||
FROM python:3.9-slim-bullseye
|
||||
FROM --platform=$BUILDPLATFORM node:16-bullseye-slim AS compile-frontend
|
||||
|
||||
# Binary dependencies
|
||||
RUN apt-get update \
|
||||
&& apt-get -y --no-install-recommends install \
|
||||
# Basic dependencies
|
||||
curl \
|
||||
gnupg \
|
||||
imagemagick \
|
||||
gettext \
|
||||
tzdata \
|
||||
gosu \
|
||||
# fonts for text file thumbnail generation
|
||||
fonts-liberation \
|
||||
# for Numpy
|
||||
libatlas-base-dev \
|
||||
libxslt1-dev \
|
||||
# thumbnail size reduction
|
||||
optipng \
|
||||
libxml2 \
|
||||
pngquant \
|
||||
unpaper \
|
||||
zlib1g \
|
||||
ghostscript \
|
||||
icc-profiles-free \
|
||||
# Mime type detection
|
||||
file \
|
||||
libmagic-dev \
|
||||
media-types \
|
||||
# OCRmyPDF dependencies
|
||||
liblept5 \
|
||||
qpdf \
|
||||
tesseract-ocr \
|
||||
tesseract-ocr-eng \
|
||||
tesseract-ocr-deu \
|
||||
tesseract-ocr-fra \
|
||||
tesseract-ocr-ita \
|
||||
tesseract-ocr-spa \
|
||||
&& rm -rf /var/lib/apt/lists/*
|
||||
# This stage compiles the frontend
|
||||
# This stage runs once for the native platform, as the outputs are not
|
||||
# dependent on target arch
|
||||
# Inputs: None
|
||||
|
||||
COPY ./src-ui /src/src-ui
|
||||
|
||||
WORKDIR /src/src-ui
|
||||
RUN set -eux \
|
||||
&& npm update npm -g \
|
||||
&& npm ci --omit=optional
|
||||
RUN set -eux \
|
||||
&& ./node_modules/.bin/ng build --configuration production
|
||||
|
||||
FROM python:3.9-slim-bullseye as main-app
|
||||
|
||||
LABEL org.opencontainers.image.authors="paperless-ngx team <hello@paperless-ngx.com>"
|
||||
LABEL org.opencontainers.image.documentation="https://paperless-ngx.readthedocs.io/en/latest/"
|
||||
LABEL org.opencontainers.image.source="https://github.com/paperless-ngx/paperless-ngx"
|
||||
LABEL org.opencontainers.image.url="https://github.com/paperless-ngx/paperless-ngx"
|
||||
LABEL org.opencontainers.image.licenses="GPL-3.0-only"
|
||||
|
||||
ARG DEBIAN_FRONTEND=noninteractive
|
||||
|
||||
#
|
||||
# Begin installation and configuration
|
||||
# Order the steps below from least often changed to most
|
||||
#
|
||||
|
||||
# copy jbig2enc
|
||||
COPY --from=jbig2enc /usr/src/jbig2enc/src/.libs/libjbig2enc* /usr/local/lib/
|
||||
COPY --from=jbig2enc /usr/src/jbig2enc/src/jbig2 /usr/local/bin/
|
||||
COPY --from=jbig2enc /usr/src/jbig2enc/src/*.h /usr/local/include/
|
||||
# Basically will never change again
|
||||
COPY --from=jbig2enc-builder /usr/src/jbig2enc/src/.libs/libjbig2enc* /usr/local/lib/
|
||||
COPY --from=jbig2enc-builder /usr/src/jbig2enc/src/jbig2 /usr/local/bin/
|
||||
COPY --from=jbig2enc-builder /usr/src/jbig2enc/src/*.h /usr/local/include/
|
||||
|
||||
# Packages need for running
|
||||
ARG RUNTIME_PACKAGES="\
|
||||
curl \
|
||||
file \
|
||||
# fonts for text file thumbnail generation
|
||||
fonts-liberation \
|
||||
gettext \
|
||||
ghostscript \
|
||||
gnupg \
|
||||
gosu \
|
||||
icc-profiles-free \
|
||||
imagemagick \
|
||||
media-types \
|
||||
liblept5 \
|
||||
libpq5 \
|
||||
libxml2 \
|
||||
liblcms2-2 \
|
||||
libtiff5 \
|
||||
libxslt1.1 \
|
||||
libfreetype6 \
|
||||
libwebp6 \
|
||||
libopenjp2-7 \
|
||||
libimagequant0 \
|
||||
libraqm0 \
|
||||
libgnutls30 \
|
||||
libjpeg62-turbo \
|
||||
python3 \
|
||||
python3-pip \
|
||||
python3-setuptools \
|
||||
postgresql-client \
|
||||
mariadb-client \
|
||||
# For Numpy
|
||||
libatlas3-base \
|
||||
# OCRmyPDF dependencies
|
||||
tesseract-ocr \
|
||||
tesseract-ocr-eng \
|
||||
tesseract-ocr-deu \
|
||||
tesseract-ocr-fra \
|
||||
tesseract-ocr-ita \
|
||||
tesseract-ocr-spa \
|
||||
# Suggested for OCRmyPDF
|
||||
pngquant \
|
||||
# Suggested for pikepdf
|
||||
jbig2dec \
|
||||
tzdata \
|
||||
unpaper \
|
||||
# Mime type detection
|
||||
zlib1g \
|
||||
# Barcode splitter
|
||||
libzbar0 \
|
||||
poppler-utils"
|
||||
|
||||
# Install basic runtime packages.
|
||||
# These change very infrequently
|
||||
RUN set -eux \
|
||||
echo "Installing system packages" \
|
||||
&& apt-get update \
|
||||
&& apt-get install --yes --quiet --no-install-recommends ${RUNTIME_PACKAGES} \
|
||||
&& rm -rf /var/lib/apt/lists/* \
|
||||
&& echo "Installing supervisor" \
|
||||
&& python3 -m pip install --default-timeout=1000 --upgrade --no-cache-dir supervisor==4.2.4
|
||||
|
||||
# Copy gunicorn config
|
||||
# Changes very infrequently
|
||||
WORKDIR /usr/src/paperless/
|
||||
|
||||
COPY gunicorn.conf.py .
|
||||
|
||||
# setup docker-specific things
|
||||
# Use mounts to avoid copying installer files into the image
|
||||
# These change sometimes, but rarely
|
||||
WORKDIR /usr/src/paperless/src/docker/
|
||||
|
||||
COPY [ \
|
||||
"docker/imagemagick-policy.xml", \
|
||||
"docker/supervisord.conf", \
|
||||
"docker/docker-entrypoint.sh", \
|
||||
"docker/docker-prepare.sh", \
|
||||
"docker/paperless_cmd.sh", \
|
||||
"docker/wait-for-redis.py", \
|
||||
"docker/management_script.sh", \
|
||||
"docker/install_management_commands.sh", \
|
||||
"/usr/src/paperless/src/docker/" \
|
||||
]
|
||||
|
||||
RUN set -eux \
|
||||
&& echo "Configuring ImageMagick" \
|
||||
&& mv imagemagick-policy.xml /etc/ImageMagick-6/policy.xml \
|
||||
&& echo "Configuring supervisord" \
|
||||
&& mkdir /var/log/supervisord /var/run/supervisord \
|
||||
&& mv supervisord.conf /etc/supervisord.conf \
|
||||
&& echo "Setting up Docker scripts" \
|
||||
&& mv docker-entrypoint.sh /sbin/docker-entrypoint.sh \
|
||||
&& chmod 755 /sbin/docker-entrypoint.sh \
|
||||
&& mv docker-prepare.sh /sbin/docker-prepare.sh \
|
||||
&& chmod 755 /sbin/docker-prepare.sh \
|
||||
&& mv wait-for-redis.py /sbin/wait-for-redis.py \
|
||||
&& chmod 755 /sbin/wait-for-redis.py \
|
||||
&& mv paperless_cmd.sh /usr/local/bin/paperless_cmd.sh \
|
||||
&& chmod 755 /usr/local/bin/paperless_cmd.sh \
|
||||
&& echo "Installing managment commands" \
|
||||
&& chmod +x install_management_commands.sh \
|
||||
&& ./install_management_commands.sh
|
||||
|
||||
# Install the built packages from the installer library images
|
||||
# Use mounts to avoid copying installer files into the image
|
||||
# These change sometimes
|
||||
RUN --mount=type=bind,from=qpdf-builder,target=/qpdf \
|
||||
--mount=type=bind,from=psycopg2-builder,target=/psycopg2 \
|
||||
--mount=type=bind,from=pikepdf-builder,target=/pikepdf \
|
||||
set -eux \
|
||||
&& echo "Installing qpdf" \
|
||||
&& apt-get install --yes --no-install-recommends /qpdf/usr/src/qpdf/libqpdf28_*.deb \
|
||||
&& apt-get install --yes --no-install-recommends /qpdf/usr/src/qpdf/qpdf_*.deb \
|
||||
&& echo "Installing pikepdf and dependencies" \
|
||||
&& python3 -m pip install --no-cache-dir /pikepdf/usr/src/wheels/pyparsing*.whl \
|
||||
&& python3 -m pip install --no-cache-dir /pikepdf/usr/src/wheels/packaging*.whl \
|
||||
&& python3 -m pip install --no-cache-dir /pikepdf/usr/src/wheels/lxml*.whl \
|
||||
&& python3 -m pip install --no-cache-dir /pikepdf/usr/src/wheels/Pillow*.whl \
|
||||
&& python3 -m pip install --no-cache-dir /pikepdf/usr/src/wheels/pikepdf*.whl \
|
||||
&& python3 -m pip list \
|
||||
&& echo "Installing psycopg2" \
|
||||
&& python3 -m pip install --no-cache-dir /psycopg2/usr/src/wheels/psycopg2*.whl \
|
||||
&& python3 -m pip list
|
||||
|
||||
WORKDIR /usr/src/paperless/src/
|
||||
|
||||
COPY requirements.txt ../
|
||||
|
||||
# Python dependencies
|
||||
RUN apt-get update \
|
||||
&& apt-get -y --no-install-recommends install \
|
||||
build-essential \
|
||||
libpq-dev \
|
||||
libqpdf-dev \
|
||||
&& python3 -m pip install --upgrade --no-cache-dir supervisor \
|
||||
&& python3 -m pip install --no-cache-dir -r ../requirements.txt \
|
||||
&& apt-get -y purge build-essential libqpdf-dev \
|
||||
&& apt-get -y autoremove --purge \
|
||||
&& rm -rf /var/lib/apt/lists/*
|
||||
# Change pretty frequently
|
||||
COPY Pipfile* ./
|
||||
|
||||
# setup docker-specific things
|
||||
COPY docker/ ./docker/
|
||||
# Packages needed only for building a few quick Python
|
||||
# dependencies
|
||||
ARG BUILD_PACKAGES="\
|
||||
build-essential \
|
||||
git \
|
||||
default-libmysqlclient-dev \
|
||||
python3-dev"
|
||||
|
||||
RUN cd docker \
|
||||
&& cp imagemagick-policy.xml /etc/ImageMagick-6/policy.xml \
|
||||
&& mkdir /var/log/supervisord /var/run/supervisord \
|
||||
&& cp supervisord.conf /etc/supervisord.conf \
|
||||
&& cp docker-entrypoint.sh /sbin/docker-entrypoint.sh \
|
||||
&& cp docker-prepare.sh /sbin/docker-prepare.sh \
|
||||
&& chmod 755 /sbin/docker-entrypoint.sh \
|
||||
&& chmod +x install_management_commands.sh \
|
||||
&& ./install_management_commands.sh \
|
||||
&& cd .. \
|
||||
&& rm docker -rf
|
||||
RUN set -eux \
|
||||
&& echo "Installing build system packages" \
|
||||
&& apt-get update \
|
||||
&& apt-get install --yes --quiet --no-install-recommends ${BUILD_PACKAGES} \
|
||||
&& python3 -m pip install --no-cache-dir --upgrade wheel \
|
||||
&& echo "Installing pipenv" \
|
||||
&& python3 -m pip install --no-cache-dir --upgrade pipenv \
|
||||
&& echo "Installing Python requirements" \
|
||||
# pipenv tries to be too fancy and prints so much junk
|
||||
&& pipenv requirements > requirements.txt \
|
||||
&& python3 -m pip install --default-timeout=1000 --no-cache-dir --requirement requirements.txt \
|
||||
&& rm requirements.txt \
|
||||
&& echo "Cleaning up image" \
|
||||
&& apt-get -y purge ${BUILD_PACKAGES} \
|
||||
&& apt-get -y autoremove --purge \
|
||||
&& apt-get clean --yes \
|
||||
# Remove pipenv and its unique packages
|
||||
&& python3 -m pip uninstall --yes \
|
||||
pipenv \
|
||||
distlib \
|
||||
platformdirs \
|
||||
virtualenv \
|
||||
virtualenv-clone \
|
||||
&& rm -rf /var/lib/apt/lists/* \
|
||||
&& rm -rf /tmp/* \
|
||||
&& rm -rf /var/tmp/* \
|
||||
&& rm -rf /var/cache/apt/archives/* \
|
||||
&& truncate -s 0 /var/log/*log
|
||||
|
||||
COPY gunicorn.conf.py ../
|
||||
# copy backend
|
||||
COPY ./src ./
|
||||
|
||||
# copy app
|
||||
COPY src/ ./
|
||||
# copy frontend
|
||||
COPY --from=compile-frontend /src/src/documents/static/frontend/ ./documents/static/frontend/
|
||||
|
||||
# add users, setup scripts
|
||||
RUN addgroup --gid 1000 paperless \
|
||||
&& useradd --uid 1000 --gid paperless --home-dir /usr/src/paperless paperless \
|
||||
&& chown -R paperless:paperless ../ \
|
||||
&& gosu paperless python3 manage.py collectstatic --clear --no-input \
|
||||
&& gosu paperless python3 manage.py compilemessages
|
||||
RUN set -eux \
|
||||
&& addgroup --gid 1000 paperless \
|
||||
&& useradd --uid 1000 --gid paperless --home-dir /usr/src/paperless paperless \
|
||||
&& chown -R paperless:paperless ../ \
|
||||
&& gosu paperless python3 manage.py collectstatic --clear --no-input \
|
||||
&& gosu paperless python3 manage.py compilemessages
|
||||
|
||||
VOLUME ["/usr/src/paperless/data", \
|
||||
"/usr/src/paperless/media", \
|
||||
"/usr/src/paperless/consume", \
|
||||
"/usr/src/paperless/export"]
|
||||
|
||||
VOLUME ["/usr/src/paperless/data", "/usr/src/paperless/media", "/usr/src/paperless/consume", "/usr/src/paperless/export"]
|
||||
ENTRYPOINT ["/sbin/docker-entrypoint.sh"]
|
||||
EXPOSE 8000
|
||||
CMD ["/usr/local/bin/supervisord", "-c", "/etc/supervisord.conf"]
|
||||
|
||||
LABEL maintainer="Jonas Winkler <dev@jpwinkler.de>"
|
||||
EXPOSE 8000
|
||||
|
||||
CMD ["/usr/local/bin/paperless_cmd.sh"]
|
||||
|
50
Pipfile
@@ -9,35 +9,37 @@ verify_ssl = true
|
||||
name = "piwheels"
|
||||
|
||||
[packages]
|
||||
dateparser = "~=1.0.0"
|
||||
django = "~=3.2"
|
||||
dateparser = "~=1.1"
|
||||
django = "~=4.0"
|
||||
django-cors-headers = "*"
|
||||
django-extensions = "*"
|
||||
django-filter = "~=2.4.0"
|
||||
django-q = "~=1.3.4"
|
||||
djangorestframework = "~=3.12.2"
|
||||
django-filter = "~=22.1"
|
||||
django-q = {editable = true, ref = "paperless-main", git = "https://github.com/paperless-ngx/django-q.git"}
|
||||
djangorestframework = "~=3.13"
|
||||
filelock = "*"
|
||||
fuzzywuzzy = {extras = ["speedup"], version = "*"}
|
||||
gunicorn = "*"
|
||||
imap-tools = "*"
|
||||
langdetect = "*"
|
||||
numpy = "~=1.20.0"
|
||||
pathvalidate = "*"
|
||||
pillow = "~=8.1"
|
||||
pikepdf = "~=2.5"
|
||||
pillow = "~=9.2"
|
||||
pikepdf = "~=5.6"
|
||||
python-gnupg = "*"
|
||||
python-dotenv = "*"
|
||||
python-dateutil = "*"
|
||||
python-magic = "*"
|
||||
psycopg2-binary = "*"
|
||||
psycopg2 = "*"
|
||||
redis = "*"
|
||||
# Pinned because aarch64 wheels and updates cause warnings when loading the classifier model.
|
||||
scikit-learn="==0.24.0"
|
||||
whitenoise = "~=5.3.0"
|
||||
watchdog = "~=2.1.0"
|
||||
whoosh="~=2.7.4"
|
||||
inotifyrecursive = "~=0.3.4"
|
||||
ocrmypdf = "~=12.3"
|
||||
scikit-learn = "~=1.1"
|
||||
# Pin this until piwheels is building 1.9 (see https://www.piwheels.org/project/scipy/)
|
||||
scipy = "==1.8.1"
|
||||
# https://github.com/paperless-ngx/paperless-ngx/issues/1364
|
||||
numpy = "==1.22.3"
|
||||
whitenoise = "~=6.2"
|
||||
watchdog = "~=2.1"
|
||||
whoosh="~=2.7"
|
||||
inotifyrecursive = "~=0.3"
|
||||
ocrmypdf = "~=13.7"
|
||||
tqdm = "*"
|
||||
tika = "*"
|
||||
# TODO: This will sadly also install daphne+dependencies,
|
||||
@@ -46,11 +48,13 @@ channels = "~=3.0"
|
||||
channels-redis = "*"
|
||||
uvicorn = {extras = ["standard"], version = "*"}
|
||||
concurrent-log-handler = "*"
|
||||
# uvloop 0.15+ incompatible with python 3.6
|
||||
uvloop = "~=0.15"
|
||||
cryptography = "~=3.4"
|
||||
"pdfminer.six" = "*"
|
||||
"backports.zoneinfo" = "*"
|
||||
"backports.zoneinfo" = {version = "*", markers = "python_version < '3.9'"}
|
||||
"importlib-resources" = {version = "*", markers = "python_version < '3.9'"}
|
||||
zipp = {version = "*", markers = "python_version < '3.9'"}
|
||||
pyzbar = "*"
|
||||
mysqlclient = "*"
|
||||
setproctitle = "*"
|
||||
|
||||
[dev-packages]
|
||||
coveralls = "*"
|
||||
@@ -62,6 +66,10 @@ pytest-django = "*"
|
||||
pytest-env = "*"
|
||||
pytest-sugar = "*"
|
||||
pytest-xdist = "*"
|
||||
sphinx = "~=3.4.2"
|
||||
sphinx = "~=5.1"
|
||||
sphinx_rtd_theme = "*"
|
||||
tox = "*"
|
||||
black = "*"
|
||||
pre-commit = "*"
|
||||
sphinx-autobuild = "*"
|
||||
myst-parser = "*"
|
||||
|
2514
Pipfile.lock
generated
158
README.md
@@ -1,116 +1,120 @@
|
||||
[](https://github.com/jonaswinkler/paperless-ng/actions)
|
||||

|
||||
[](https://crowdin.com/project/paperless-ng)
|
||||
[](https://paperless-ng.readthedocs.io/en/latest/?badge=latest)
|
||||
[](https://gitter.im/paperless-ng/community?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
|
||||
[](https://hub.docker.com/r/jonaswinkler/paperless-ng)
|
||||
[](https://coveralls.io/github/jonaswinkler/paperless-ng?branch=master)
|
||||
[](https://github.com/paperless-ngx/paperless-ngx/actions)
|
||||
[](https://crowdin.com/project/paperless-ngx)
|
||||
[](https://paperless-ngx.readthedocs.io/en/latest/?badge=latest)
|
||||
[](https://coveralls.io/github/paperless-ngx/paperless-ngx?branch=master)
|
||||
[](https://matrix.to/#/#paperless:adnidor.de)
|
||||
|
||||
# Paperless-ng
|
||||
<p align="center">
|
||||
<img src="https://github.com/paperless-ngx/paperless-ngx/raw/main/resources/logo/web/png/Black%20logo%20-%20no%20background.png#gh-light-mode-only" width="50%" />
|
||||
<img src="https://github.com/paperless-ngx/paperless-ngx/raw/main/resources/logo/web/png/White%20logo%20-%20no%20background.png#gh-dark-mode-only" width="50%" />
|
||||
</p>
|
||||
|
||||
[Paperless (click me)](https://github.com/the-paperless-project/paperless) is an application by Daniel Quinn and contributors that indexes your scanned documents and allows you to easily search for documents and store metadata alongside your documents.
|
||||
<!-- omit in toc -->
|
||||
|
||||
Paperless-ng is a fork of the original project, adding a new interface and many other changes under the hood. These key points should help you decide whether Paperless-ng is something you would prefer over Paperless:
|
||||
# Paperless-ngx
|
||||
|
||||
* Interface: The new front end is the main interface for Paperless-ng, the old interface still exists but most customizations (such as thumbnails for the document list) have been removed.0
|
||||
* Encryption: Paperless-ng does not support GnuPG anymore, since storing your data on encrypted file systems (that you optionally mount on demand) achieves about the same result.
|
||||
* Resource usage: Paperless-ng does use a bit more resources than Paperless. Running the web server requires about 300MB of RAM or more, depending on the configuration. While adding documents, it requires about 300MB additional RAM, depending on the document. It still runs on Raspberry Pi (many users do that), but it has been generally geared to better use the resources of more powerful systems.
|
||||
* API changes: If you rely on the REST API of paperless, some of its functionality has been changed.
|
||||
Paperless-ngx is a document management system that transforms your physical documents into a searchable online archive so you can keep, well, _less paper_.
|
||||
|
||||
For a detailed list of changes, have a look at the [change log](https://paperless-ng.readthedocs.io/en/latest/changelog.html) in the documentation, especially the section about the [0.9.0 release](https://paperless-ng.readthedocs.io/en/latest/changelog.html#paperless-ng-0-9-0).
|
||||
Paperless-ngx forked from [paperless-ng](https://github.com/jonaswinkler/paperless-ng) to continue the great work and distribute responsibility of supporting and advancing the project among a team of people. [Consider joining us!](#community-support) Discussion of this transition can be found in issues
|
||||
[#1599](https://github.com/jonaswinkler/paperless-ng/issues/1599) and [#1632](https://github.com/jonaswinkler/paperless-ng/issues/1632).
|
||||
|
||||
# How it Works
|
||||
A demo is available at [demo.paperless-ngx.com](https://demo.paperless-ngx.com) using login `demo` / `demo`. _Note: demo content is reset frequently and confidential information should not be uploaded._
|
||||
|
||||
Paperless does not control your scanner, it only helps you deal with what your scanner produces.
|
||||
|
||||
1. Buy a document scanner that can write to a place on your network. If you need some inspiration, have a look at the [scanner recommendations](https://paperless-ng.readthedocs.io/en/latest/scanners.html) page. Set it up to "scan to FTP" or something similar. It should be able to push scanned images to a server without you having to do anything. Of course if your scanner doesn't know how to automatically upload the file somewhere, you can always do that manually. Paperless doesn't care how the documents get into its local consumption directory.
|
||||
|
||||
- Alternatively, you can use any of the mobile scanning apps out there. We have an app that allows you to share documents with paperless, if you're on Android. See the section on affiliated projects below.
|
||||
|
||||
2. Wait for paperless to process your files. OCR is expensive, and depending on the power of your machine, this might take a bit of time.
|
||||
3. Use the web frontend to sift through the database and find what you want.
|
||||
4. Download the PDF you need/want via the web interface and do whatever you like with it. You can even print it and send it as if it's the original. In most cases, no one will care or notice.
|
||||
|
||||
Here's what you get:
|
||||
|
||||

|
||||
|
||||
If you want to see paperless-ng in action, [more screenshots are available in the documentation](https://paperless-ng.readthedocs.io/en/latest/screenshots.html).
|
||||
- [Features](#features)
|
||||
- [Getting started](#getting-started)
|
||||
- [Contributing](#contributing)
|
||||
- [Community Support](#community-support)
|
||||
- [Translation](#translation)
|
||||
- [Feature Requests](#feature-requests)
|
||||
- [Bugs](#bugs)
|
||||
- [Affiliated Projects](#affiliated-projects)
|
||||
- [Important Note](#important-note)
|
||||
|
||||
# Features
|
||||
|
||||
* Performs OCR on your documents, adds selectable text to image only documents and adds tags, correspondents and document types to your documents.
|
||||
* Supports PDF documents, images, plain text files, and Office documents (Word, Excel, Powerpoint, and LibreOffice equivalents).
|
||||
* Office document support is optional and provided by Apache Tika (see [configuration](https://paperless-ng.readthedocs.io/en/latest/configuration.html#tika-settings))
|
||||
* Paperless stores your documents plain on disk. Filenames and folders are managed by paperless and their format can be configured freely.
|
||||
* Single page application front end.
|
||||
* Includes a dashboard that shows basic statistics and has document upload.
|
||||
* Filtering by tags, correspondents, types, and more.
|
||||
* Customizable views can be saved and displayed on the dashboard.
|
||||
* Full text search helps you find what you need.
|
||||
* Auto completion suggests relevant words from your documents.
|
||||
* Results are sorted by relevance to your search query.
|
||||
* Highlighting shows you which parts of the document matched the query.
|
||||
* Searching for similar documents ("More like this")
|
||||
* Email processing: Paperless adds documents from your email accounts.
|
||||
* Configure multiple accounts and filters for each account.
|
||||
* When adding documents from mail, paperless can move these emails to a new folder, mark them as read, flag them as important or delete them.
|
||||
* Machine learning powered document matching.
|
||||
* Paperless learns from your documents and will be able to automatically assign tags, correspondents and types to documents once you've stored a few documents in paperless.
|
||||
* Optimized for multi core systems: Paperless-ng consumes multiple documents in parallel.
|
||||
* The integrated sanity checker makes sure that your document archive is in good health.
|
||||

|
||||

|
||||
|
||||
- Organize and index your scanned documents with tags, correspondents, types, and more.
|
||||
- Performs OCR on your documents, adds selectable text to image only documents and adds tags, correspondents and document types to your documents.
|
||||
- Supports PDF documents, images, plain text files, and Office documents (Word, Excel, Powerpoint, and LibreOffice equivalents).
|
||||
- Office document support is optional and provided by Apache Tika (see [configuration](https://paperless-ngx.readthedocs.io/en/latest/configuration.html#tika-settings))
|
||||
- Paperless stores your documents plain on disk. Filenames and folders are managed by paperless and their format can be configured freely.
|
||||
- Single page application front end.
|
||||
- Includes a dashboard that shows basic statistics and has document upload.
|
||||
- Filtering by tags, correspondents, types, and more.
|
||||
- Customizable views can be saved and displayed on the dashboard.
|
||||
- Full text search helps you find what you need.
|
||||
- Auto completion suggests relevant words from your documents.
|
||||
- Results are sorted by relevance to your search query.
|
||||
- Highlighting shows you which parts of the document matched the query.
|
||||
- Searching for similar documents ("More like this")
|
||||
- Email processing: Paperless adds documents from your email accounts.
|
||||
- Configure multiple accounts and filters for each account.
|
||||
- When adding documents from mail, paperless can move these emails to a new folder, mark them as read, flag them as important or delete them.
|
||||
- Machine learning powered document matching.
|
||||
- Paperless-ngx learns from your documents and will be able to automatically assign tags, correspondents and types to documents once you've stored a few documents in paperless.
|
||||
- Optimized for multi core systems: Paperless-ngx consumes multiple documents in parallel.
|
||||
- The integrated sanity checker makes sure that your document archive is in good health.
|
||||
- [More screenshots are available in the documentation](https://paperless-ngx.readthedocs.io/en/latest/screenshots.html).
|
||||
|
||||
# Getting started
|
||||
|
||||
The recommended way to deploy paperless is docker-compose. The files in the /docker/hub directory are configured to pull the image from Docker Hub.
|
||||
The easiest way to deploy paperless is docker-compose. The files in the [`/docker/compose` directory](https://github.com/paperless-ngx/paperless-ngx/tree/main/docker/compose) are configured to pull the image from GitHub Packages.
|
||||
|
||||
Read the [documentation](https://paperless-ng.readthedocs.io/en/latest/setup.html#installation) on how to get started.
|
||||
If you'd like to jump right in, you can configure a docker-compose environment with our install script:
|
||||
|
||||
Alternatively, you can install the dependencies and set up Apache and a database server yourself. The documentation has a step-by-step guide on how to do it. Consider giving the Ansible role a shot; it essentially automates the entire bare metal installation process.
|
||||
```bash
|
||||
bash -c "$(curl -L https://raw.githubusercontent.com/paperless-ngx/paperless-ngx/main/install-paperless-ngx.sh)"
|
||||
```
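For orientation, this is roughly the shape of the compose service the script sets up. This is a minimal sketch only; the image reference, container paths and service names are assumptions, and the files in `/docker/compose` are authoritative:

```yaml
version: "3.4"
services:
  broker:
    image: redis:6
    restart: unless-stopped
  webserver:
    # Assumed GitHub Packages image; check /docker/compose for the exact reference.
    image: ghcr.io/paperless-ngx/paperless-ngx:latest
    restart: unless-stopped
    depends_on:
      - broker
    ports:
      - 8000:8000
    environment:
      # Same format the bare metal role uses: redis://<host>:<port>
      PAPERLESS_REDIS: redis://broker:6379
    volumes:
      - data:/usr/src/paperless/data
      - media:/usr/src/paperless/media
volumes:
  data:
  media:
```

Once the files are in place, `docker-compose pull` followed by `docker-compose up -d` brings the stack up.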
|
||||
|
||||
# Migrating from Paperless to Paperless-ng
|
||||
Alternatively, you can install the dependencies and set up Apache and a database server yourself. The [documentation](https://paperless-ngx.readthedocs.io/en/latest/setup.html#installation) has a step-by-step guide on how to do it.
|
||||
|
||||
Read the section about [migration](https://paperless-ng.readthedocs.io/en/latest/setup.html#migration-to-paperless-ng) in the documentation. It's also entirely possible to go back to Paperless by reverting the database migrations.
|
||||
Migrating from Paperless-ng is easy: just drop in the new docker image! See the [documentation on migrating](https://paperless-ngx.readthedocs.io/en/latest/setup.html#migrating-from-paperless-ng) for more details.
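For docker-compose setups this roughly amounts to changing the image reference and re-pulling. The image names below are assumptions for illustration; follow the linked migration documentation for your exact setup:

```yaml
services:
  webserver:
    # before (paperless-ng), assumed Docker Hub image:
    # image: jonaswinkler/paperless-ng:latest
    # after (paperless-ngx), assumed GitHub Packages image:
    image: ghcr.io/paperless-ngx/paperless-ngx:latest
```

Afterwards, pull the new image and recreate the containers.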
|
||||
|
||||
# Documentation
|
||||
<!-- omit in toc -->
|
||||
|
||||
The documentation for Paperless-ng is available on [ReadTheDocs](https://paperless-ng.readthedocs.io/).
|
||||
### Documentation
|
||||
|
||||
# Translation
|
||||
The documentation for Paperless-ngx is available on [ReadTheDocs](https://paperless-ngx.readthedocs.io/).
|
||||
|
||||
Paperless is available in many different languages. Translation is coordinated on Crowdin. If you want to help out by translating paperless into your language, please head over to https://github.com/jonaswinkler/paperless-ng/issues/212 for details!
|
||||
# Contributing
|
||||
|
||||
# Feature Requests
|
||||
If you feel like contributing to the project, please do! Bug fixes, enhancements, visual fixes etc. are always welcome. If you want to implement something big: Please start a discussion about that! The [documentation](https://paperless-ngx.readthedocs.io/en/latest/extending.html) has some basic information on how to get started.
|
||||
|
||||
Feature requests can be submitted via [GitHub Discussions](https://github.com/jonaswinkler/paperless-ng/discussions/categories/feature-requests), you can search for existing ideas, add your own and vote for the ones you care about! Note that some older feature requests can also be found under [issues](https://github.com/jonaswinkler/paperless-ng/issues).
|
||||
## Community Support
|
||||
|
||||
# Questions? Something not working?
|
||||
People interested in continuing the work on paperless-ngx are encouraged to reach out here on github and in the [Matrix Room](https://matrix.to/#/#paperless:adnidor.de). If you would like to contribute to the project on an ongoing basis there are multiple [teams](https://github.com/orgs/paperless-ngx/people) (frontend, ci/cd, etc) that could use your help so please reach out!
|
||||
|
||||
For bugs please [open an issue](https://github.com/jonaswinkler/paperless-ng/issues) or [start a discussion](https://github.com/jonaswinkler/paperless-ng/discussions) if you have questions.
|
||||
## Translation
|
||||
|
||||
## Feel like helping out?
|
||||
Paperless-ngx is available in many languages that are coordinated on Crowdin. If you want to help out by translating paperless-ngx into your language, please head over to https://crwd.in/paperless-ngx, and thank you! More details can be found in [CONTRIBUTING.md](https://github.com/paperless-ngx/paperless-ngx/blob/main/CONTRIBUTING.md#translating-paperless-ngx).
|
||||
|
||||
There are still lots of things to be done; just have a look at the open issues & discussions. If you feel like contributing to the project, please do! Bug fixes and improvements to the front end (I just can't seem to get some of these CSS things right) are always welcome. The documentation has some basic information on how to get started.
|
||||
## Feature Requests
|
||||
|
||||
If you want to implement something big: Please start a discussion about that! Maybe I've already had something similar in mind and we can make it happen together. However, keep in mind that the general roadmap is to make the existing features stable and get them tested.
|
||||
Feature requests can be submitted via [GitHub Discussions](https://github.com/paperless-ngx/paperless-ngx/discussions/categories/feature-requests), you can search for existing ideas, add your own and vote for the ones you care about.
|
||||
|
||||
## Bugs
|
||||
|
||||
For bugs please [open an issue](https://github.com/paperless-ngx/paperless-ngx/issues) or [start a discussion](https://github.com/paperless-ngx/paperless-ngx/discussions) if you have questions.
|
||||
|
||||
# Affiliated Projects
|
||||
|
||||
Paperless has been around a while now, and people are starting to build stuff on top of it. If you're one of those people, we can add your project to this list:
|
||||
|
||||
|
||||
* [Paperless App](https://github.com/bauerj/paperless_app): An Android/iOS app for Paperless. Updated to work with paperless-ng.
|
||||
* [Paperless Share](https://github.com/qcasey/paperless_share). Share any files from your Android application with paperless. Very simple, but works with all of the mobile scanning apps out there that allow you to share scanned documents.
|
||||
* [Scan to Paperless](https://github.com/sbrunner/scan-to-paperless): Scan and prepare (crop, deskew, OCR, ...) your documents for Paperless.
|
||||
- [Paperless App](https://github.com/bauerj/paperless_app): An Android/iOS app for Paperless-ngx. Also works with the original Paperless and Paperless-ng.
|
||||
- [Paperless Share](https://github.com/qcasey/paperless_share). Share any files from your Android application with paperless. Very simple, but works with all of the mobile scanning apps out there that allow you to share scanned documents.
|
||||
- [Scan to Paperless](https://github.com/sbrunner/scan-to-paperless): Scan and prepare (crop, deskew, OCR, ...) your documents for Paperless.
|
||||
|
||||
These projects also exist, but their status and compatibility with paperless-ng are unknown.
|
||||
These projects also exist, but their status and compatibility with paperless-ngx are unknown.
|
||||
|
||||
* [paperless-cli](https://github.com/stgarf/paperless-cli): A golang command line binary to interact with a Paperless instance.
|
||||
- [paperless-cli](https://github.com/stgarf/paperless-cli): A golang command line binary to interact with a Paperless instance.
|
||||
|
||||
This project also exists, but needs updates to be compatible with paperless-ng.
|
||||
This project also exists, but needs updates to be compatible with paperless-ngx.
|
||||
|
||||
* [Paperless Desktop](https://github.com/thomasbrueggemann/paperless-desktop): A desktop UI for your Paperless installation. Runs on Mac, Linux, and Windows.
|
||||
Known issues on Mac: (Could not load reminders and documents)
|
||||
- [Paperless Desktop](https://github.com/thomasbrueggemann/paperless-desktop): A desktop UI for your Paperless installation. Runs on Mac, Linux, and Windows.
|
||||
Known issues on Mac: (Could not load reminders and documents)
|
||||
|
||||
# Important Note
|
||||
|
||||
Document scanners are typically used to scan sensitive documents. Things like your social insurance number, tax records, invoices, etc. Everything is stored in the clear without encryption. This means that Paperless should never be run on an untrusted host. Instead, I recommend that if you do want to use it, run it locally on a server in your own home.
|
||||
|
||||
|
@@ -1,116 +0,0 @@
|
||||
Ansible Role: paperless-ng
|
||||
==========================
|
||||
|
||||
Installs and configures paperless-ng EDMS on Debian/Ubuntu systems.
|
||||
|
||||
Requirements
|
||||
------------
|
||||
|
||||
No special system requirements. Ansible 2.7 or newer is required.
|
||||
|
||||
Note that this role requires root access, so either run it in a playbook with a global `become: yes`, or invoke the role in your playbook like:
|
||||
|
||||
- hosts: all
|
||||
roles:
|
||||
- role: paperless-ng
|
||||
become: yes
|
||||
|
||||
Role Variables
|
||||
--------------
|
||||
|
||||
Most configuration variables from paperless-ng itself are available and accept their respective arguments.
|
||||
Every `PAPERLESS_*` configuration variable is lowercased, with the `PAPERLESS_` prefix replaced by `paperlessng_`, in `defaults/main.yml`.
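For example (illustrative; both names appear in this role):

    # PAPERLESS_OCR_PAGES=0 in paperless.conf corresponds to:
    paperlessng_ocr_pages: 0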
|
||||
|
||||
For a full listing including explanations and allowed values, see the current [documentation](https://paperless-ng.readthedocs.io/en/latest/configuration.html).
|
||||
|
||||
Additional variables available in this role are listed below, along with default values:
|
||||
|
||||
paperlessng_version: latest
|
||||
|
||||
The [release](https://github.com/jonaswinkler/paperless-ng/releases) archive version of paperless-ng to install.
|
||||
`latest` stands for the latest release of paperless-ng.
|
||||
To install a specific version of paperless-ng, use the tag name of the release, e.g. `ng-1.4.4`, or specify a branch or commit id.
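For example (values for illustration only):

    paperlessng_version: ng-1.4.4    # a specific release tag
    # paperlessng_version: master    # or a branch name / commit id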
|
||||
|
||||
paperlessng_redis_host: localhost
|
||||
paperlessng_redis_port: 6379
|
||||
|
||||
Separate configuration values that combine into `PAPERLESS_REDIS` (with the defaults above, the role writes `PAPERLESS_REDIS=redis://localhost:6379`).
|
||||
|
||||
paperlessng_db_type: sqlite
|
||||
|
||||
Database to use. Default is file-based SQLite.
|
||||
|
||||
paperlessng_db_host: localhost
|
||||
paperlessng_db_port: 5432
|
||||
paperlessng_db_name: paperlessng
|
||||
paperlessng_db_user: paperlessng
|
||||
paperlessng_db_pass: paperlessng
|
||||
paperlessng_db_sslmode: prefer
|
||||
|
||||
Database configuration (only applicable if `paperlessng_db_type == 'postgresql'`).
|
||||
|
||||
paperlessng_directory: /opt/paperless-ng
|
||||
|
||||
Root directory paperless-ng is installed into.
|
||||
|
||||
paperlessng_virtualenv: "{{ paperlessng_directory }}/.venv"
|
||||
|
||||
Directory used for the virtual environment for paperless-ng.
|
||||
|
||||
paperlessng_ocr_languages:
|
||||
- eng
|
||||
|
||||
List of OCR languages to install and configure (`apt search tesseract-ocr-*`).
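For example, to install English and German language packs (as in the example playbook below):

    paperlessng_ocr_languages:
      - eng
      - deu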
|
||||
|
||||
paperlessng_use_jbig2enc: True
|
||||
|
||||
Whether to install and use [jbig2enc](https://github.com/agl/jbig2enc) for OCRmyPDF.
|
||||
|
||||
paperlessng_big2enc_lossy: False
|
||||
|
||||
Whether to use jbig2enc's lossy compression mode.
|
||||
|
||||
paperlessng_superuser_name: paperlessng
|
||||
paperlessng_superuser_email: paperlessng@example.com
|
||||
paperlessng_superuser_password: paperlessng
|
||||
|
||||
Credentials of the initial superuser in paperless-ng.
|
||||
|
||||
paperlessng_system_user: paperlessng
|
||||
paperlessng_system_group: paperlessng
|
||||
|
||||
System user and group to run the paperless-ng services as (will be created if required).
|
||||
|
||||
paperlessng_listen_address: 127.0.0.1
|
||||
paperlessng_listen_port: 8000
|
||||
|
||||
Address and port for the paperless-ng service to listen on.
|
||||
|
||||
Dependencies
|
||||
------------
|
||||
|
||||
No ansible dependencies.
|
||||
|
||||
Example Playbook
|
||||
----------------
|
||||
`playbook.yml`:
|
||||
|
||||
- hosts: all
|
||||
become: yes
|
||||
vars_files:
|
||||
- vars/paperless-ng.yml
|
||||
roles:
|
||||
- paperless-ng
|
||||
|
||||
`vars/paperless-ng.yml`:
|
||||
|
||||
paperlessng_media_root: /mnt/media/smbshare
|
||||
|
||||
paperlessng_db_type: postgresql
|
||||
paperlessng_db_pass: PLEASEPROVIDEASTRONGPASSWORDHERE
|
||||
|
||||
paperlessng_secret_key: AGAINPLEASECHANGETHISNOW
|
||||
|
||||
paperlessng_ocr_languages:
|
||||
- eng
|
||||
- deu
|
@@ -1,83 +0,0 @@
|
||||
---
|
||||
paperlessng_version: latest # 'latest', release number, or github branch/tag/commit/ref
|
||||
|
||||
# Required services
|
||||
paperlessng_redis_host: localhost
|
||||
paperlessng_redis_port: 6379
|
||||
paperlessng_db_type: sqlite # or postgresql
|
||||
# Below entries only apply for paperlessng_db_type=='postgresql'
|
||||
paperlessng_db_host: localhost
|
||||
paperlessng_db_port: 5432
|
||||
paperlessng_db_name: paperlessng
|
||||
paperlessng_db_user: paperlessng
|
||||
paperlessng_db_pass: paperlessng
|
||||
paperlessng_db_sslmode: prefer
|
||||
|
||||
# Paths and folders
|
||||
paperlessng_directory: /opt/paperless-ng
|
||||
paperlessng_consumption_dir: "{{ paperlessng_directory }}/consumption"
|
||||
paperlessng_data_dir: "{{ paperlessng_directory }}/data"
|
||||
paperlessng_media_root: "{{ paperlessng_directory }}/media"
|
||||
paperlessng_staticdir: "{{ paperlessng_directory }}/static"
|
||||
paperlessng_filename_format:
|
||||
paperlessng_logging_dir: "{{ paperlessng_data_dir }}/log"
|
||||
paperlessng_virtualenv: "{{ paperlessng_directory }}/.venv"
|
||||
|
||||
# Hosting & Security
|
||||
paperlessng_secret_key: PLEASECHANGETHISFORTHELOVEOFGOD
|
||||
paperlessng_allowed_hosts: "*"
|
||||
paperlessng_cors_allowed_hosts: http://localhost:8000
|
||||
paperlessng_force_script_name:
|
||||
paperlessng_static_url: /static/
|
||||
paperlessng_auto_login_username:
|
||||
paperlessng_cookie_prefix: ""
|
||||
paperlessng_enable_http_remote_user: False
|
||||
|
||||
# OCR settings
|
||||
paperlessng_ocr_languages:
|
||||
- eng
|
||||
paperlessng_ocr_mode: skip
|
||||
paperlessng_ocr_clean: clean
|
||||
paperlessng_ocr_deskew: True
|
||||
paperlessng_ocr_rotate_pages: True
|
||||
paperlessng_ocr_rotate_pages_threshold: 12
|
||||
paperlessng_ocr_output_type: pdfa
|
||||
paperlessng_ocr_pages: 0
|
||||
paperlessng_ocr_image_dpi:
|
||||
# see https://ocrmypdf.readthedocs.io/en/latest/api.html#ocrmypdf.ocr
|
||||
paperlessng_ocr_user_args:
|
||||
- "optimize": 1
|
||||
paperlessng_use_jbig2enc: True
|
||||
paperlessng_big2enc_lossy: False
|
||||
|
||||
# Tika settings
|
||||
paperlessng_tika_enabled: False
|
||||
paperlessng_tika_endpoint: http://localhost:9998
|
||||
paperlessng_tika_gotenberg_endpoint: http://localhost:3000
|
||||
|
||||
# Software tweaks
|
||||
paperlessng_time_zone: Europe/Berlin
|
||||
paperlessng_consumer_polling: 0
|
||||
paperlessng_consumer_delete_duplicates: False
|
||||
paperlessng_consumer_recursive: False
|
||||
paperlessng_consumer_subdirs_as_tags: False
|
||||
paperlessng_convert_memory_limit: 0
|
||||
paperlessng_convert_tmpdir:
|
||||
paperlessng_optimize_thumbnails: True
|
||||
paperlessng_post_consume_script:
|
||||
paperlessng_filename_date_order:
|
||||
paperlessng_thumbnail_font_name: /usr/share/fonts/liberation/LiberationSerif-Regular.ttf
|
||||
paperlessng_ignore_dates: ""
|
||||
|
||||
# Superuser settings
|
||||
paperlessng_superuser_name: paperlessng
|
||||
paperlessng_superuser_email: paperlessng@example.com
|
||||
paperlessng_superuser_password: paperlessng
|
||||
|
||||
# System user settings
|
||||
paperlessng_system_user: paperlessng
|
||||
paperlessng_system_group: paperlessng
|
||||
|
||||
# Webserver settings
|
||||
paperlessng_listen_address: 127.0.0.1
|
||||
paperlessng_listen_port: 8000
|
@@ -1,17 +0,0 @@
|
||||
dependencies: []
|
||||
|
||||
galaxy_info:
|
||||
author: C0nsultant
|
||||
description: Bare-metal deployment of paperless-ng DMS
|
||||
license: license (GPLv3)
|
||||
min_ansible_version: 2.7
|
||||
|
||||
platforms:
|
||||
- name: Debian
|
||||
versions:
|
||||
- buster
|
||||
- name: Ubuntu
|
||||
versions:
|
||||
- focal
|
||||
|
||||
galaxy_tags: [EDMS, django, python, web]
|
@@ -1,10 +0,0 @@
|
||||
---
|
||||
- name: update previous release to newest release
|
||||
hosts: all
|
||||
tasks:
|
||||
- name: set current github commit as version when available
|
||||
set_fact:
|
||||
paperlessng_version: "{{ lookup('env', 'TARGET_GITHUB_SHA') | default('master', True) }}"
|
||||
- name: update to newest paperless-ng release
|
||||
include_role:
|
||||
name: ansible
|
@@ -1,35 +0,0 @@
|
||||
---
|
||||
dependency:
|
||||
name: galaxy
|
||||
driver:
|
||||
name: docker
|
||||
platforms:
|
||||
- name: ubuntu_focal
|
||||
image: jrei/systemd-ubuntu:20.04
|
||||
privileged: true
|
||||
volumes:
|
||||
- /sys/fs/cgroup:/sys/fs/cgroup:ro
|
||||
tmpfs:
|
||||
- /tmp
|
||||
- /run
|
||||
- /run/lock
|
||||
override_command: False
|
||||
# ubuntu 18.04 bionic works except that
|
||||
# the default redis configuration expects IPv6 which is not enabled in docker by default
|
||||
# the default Python environment is configured for ASCII instead of UTF-8
|
||||
# ubuntu 16.04 xenial only has Python 3.5 which is EOL and breaks multiple dependencies
|
||||
- name: debian_buster
|
||||
image: jrei/systemd-debian:10
|
||||
privileged: true
|
||||
volumes:
|
||||
- /sys/fs/cgroup:/sys/fs/cgroup:ro
|
||||
tmpfs:
|
||||
- /tmp
|
||||
- /run
|
||||
- /run/lock
|
||||
override_command: False
|
||||
# debian 9 stretch only has Python 3.5 which is EOL and breaks multiple dependencies
|
||||
provisioner:
|
||||
name: ansible
|
||||
verifier:
|
||||
name: ansible
|
@@ -1,10 +0,0 @@
|
||||
- name: install previous release
|
||||
hosts: all
|
||||
tasks:
|
||||
- name: set previous version as installation target
|
||||
set_fact:
|
||||
paperlessng_version: latest
|
||||
|
||||
- name: install previous paperless-ng release
|
||||
include_role:
|
||||
name: ansible
|
@@ -1,94 +0,0 @@
|
||||
---
|
||||
- name: Verify
|
||||
hosts: all
|
||||
gather_facts: false
|
||||
|
||||
vars_files:
|
||||
- ../../defaults/main.yml
|
||||
|
||||
tasks:
|
||||
- name: check if webserver is up
|
||||
uri:
|
||||
url: "http://{{ paperlessng_listen_address }}:{{ paperlessng_listen_port }}"
|
||||
status_code: [200, 302]
|
||||
return_content: yes
|
||||
register: landingpage
|
||||
failed_when: "'Sign in</button>' not in landingpage.content"
|
||||
|
||||
- name: generate random name and content
|
||||
set_fact:
|
||||
content: "{{ lookup('password', '/dev/null length=65536 chars=ascii_letters') }}"
|
||||
filename: "{{ lookup('password', '/dev/null length=8 chars=ascii_letters') }}"
|
||||
|
||||
- name: check if document posting works
|
||||
uri:
|
||||
url: "http://{{ paperlessng_listen_address }}:{{ paperlessng_listen_port }}/api/documents/post_document/"
|
||||
method: POST
|
||||
body_format: form-multipart
|
||||
body:
|
||||
document:
|
||||
content: "{{ content }}"
|
||||
filename: "{{ filename }}.txt"
|
||||
headers:
|
||||
Authorization: 'Basic {{ (paperlessng_superuser_name + ":" + paperlessng_superuser_password) | b64encode }}'
|
||||
Content-Type: text/plain
|
||||
return_content: yes
|
||||
register: post_document
|
||||
failed_when: "'OK' not in post_document.content"
|
||||
|
||||
- name: verify uploaded document has been accepted
|
||||
uri:
|
||||
url: "http://{{ paperlessng_listen_address }}:{{ paperlessng_listen_port }}/api/logs/paperless/"
|
||||
headers:
|
||||
Authorization: 'Basic {{ (paperlessng_superuser_name + ":" + paperlessng_superuser_password) | b64encode }}'
|
||||
return_content: yes
|
||||
register: logs
|
||||
failed_when: "('Consuming ' + filename + '.txt') not in logs.content"
|
||||
|
||||
- name: sleep till consumption finished
|
||||
pause:
|
||||
seconds: 10
|
||||
|
||||
- name: verify uploaded document has been consumed
|
||||
uri:
|
||||
url: "http://{{ paperlessng_listen_address }}:{{ paperlessng_listen_port }}/api/logs/paperless/"
|
||||
headers:
|
||||
Authorization: 'Basic {{ (paperlessng_superuser_name + ":" + paperlessng_superuser_password) | b64encode }}'
|
||||
return_content: yes
|
||||
register: logs
|
||||
failed_when: "filename + ' consumption finished' not in logs.content"
|
||||
|
||||
- name: get documents
|
||||
uri:
|
||||
url: "http://{{ paperlessng_listen_address }}:{{ paperlessng_listen_port }}/api/documents/"
|
||||
headers:
|
||||
Authorization: 'Basic {{ (paperlessng_superuser_name + ":" + paperlessng_superuser_password) | b64encode }}'
|
||||
return_content: yes
|
||||
register: documents
|
||||
|
||||
- name: set document index
|
||||
set_fact:
|
||||
index: "{{ documents.json['results'][0]['id'] }}"
|
||||
|
||||
- name: verify uploaded document is available
|
||||
uri:
|
||||
url: "http://{{ paperlessng_listen_address }}:{{ paperlessng_listen_port }}/api/documents/{{ index }}/"
|
||||
headers:
|
||||
Authorization: 'Basic {{ (paperlessng_superuser_name + ":" + paperlessng_superuser_password) | b64encode }}'
|
||||
return_content: yes
|
||||
register: document
|
||||
failed_when: "'Not found.' in document.content or content not in document.json['content']"
|
||||
|
||||
- name: check if deleting uploaded document works
|
||||
uri:
|
||||
url: "http://{{ paperlessng_listen_address }}:{{ paperlessng_listen_port }}/api/documents/bulk_edit/"
|
||||
method: POST
|
||||
body_format: json
|
||||
body:
|
||||
documents: ["{{ index }}"]
|
||||
method: delete
|
||||
parameters: {}
|
||||
headers:
|
||||
Authorization: 'Basic {{ (paperlessng_superuser_name + ":" + paperlessng_superuser_password) | b64encode }}'
|
||||
register: delete_document
|
||||
failed_when: "'OK' not in delete_document.json['result']"
|
@@ -1,6 +0,0 @@
|
||||
---
|
||||
- name: extract paperless-ng
|
||||
unarchive:
|
||||
src: "https://github.com/jonaswinkler/paperless-ng/releases/download/ng-{{ paperlessng_version }}/paperless-ng-{{ paperlessng_version }}.tar.xz"
|
||||
remote_src: yes
|
||||
dest: "{{ tempdir.path }}"
|
@@ -1,112 +0,0 @@
|
||||
---
|
||||
- name: install dev dependencies
|
||||
apt:
|
||||
pkg:
|
||||
- git
|
||||
- npm
|
||||
- gettext
|
||||
|
||||
- name: create output directories
|
||||
file:
|
||||
path: "{{ item }}"
|
||||
state: directory
|
||||
owner: "{{ paperlessng_system_user }}"
|
||||
group: "{{ paperlessng_system_group }}"
|
||||
mode: "750"
|
||||
with_items:
|
||||
- "{{ tempdir.path }}/paperless-ng"
|
||||
- "{{ tempdir.path }}/paperless-ng/scripts"
|
||||
|
||||
- block:
|
||||
- name: create temporary git directory
|
||||
tempfile:
|
||||
state: directory
|
||||
path: "{{ paperlessng_directory }}"
|
||||
register: gitdir
|
||||
|
||||
- name: pull paperless-ng
|
||||
git:
|
||||
repo: https://github.com/jonaswinkler/paperless-ng.git
|
||||
dest: "{{ gitdir.path }}"
|
||||
version: "{{ paperlessng_version }}"
|
||||
refspec: "+refs/pull/*:refs/pull/*"
|
||||
|
||||
- name: compile frontend
|
||||
command:
|
||||
cmd: "{{ item }}"
|
||||
args:
|
||||
chdir: "{{ gitdir.path }}/src-ui"
|
||||
failed_when: false
|
||||
with_items:
|
||||
- npm install -g @angular/cli
|
||||
- npm install
|
||||
- ./node_modules/.bin/ng build --prod
|
||||
|
||||
- name: copy application into place
|
||||
copy:
|
||||
src: "{{ gitdir.path }}/{{ item.src }}"
|
||||
remote_src: yes
|
||||
dest: "{{ tempdir.path }}/paperless-ng/{{ item.dest | default('') }}"
|
||||
with_items:
|
||||
- src: CONTRIBUTING.md
|
||||
- src: LICENSE
|
||||
- src: Pipfile
|
||||
- src: Pipfile.lock
|
||||
- src: README.md
|
||||
- src: requirements.txt
|
||||
- src: gunicorn.conf.py
|
||||
- src: paperless.conf.example
|
||||
dest: "paperless.conf"
|
||||
|
||||
- name: glob all scripts
|
||||
find:
|
||||
paths: ["{{ gitdir.path }}/scripts/"]
|
||||
patterns:
|
||||
- "*.service"
|
||||
- "*.sh"
|
||||
register: glob
|
||||
|
||||
- name: copy scripts
|
||||
copy:
|
||||
src: "{{ item.path }}"
|
||||
remote_src: yes
|
||||
dest: "{{ tempdir.path }}/paperless-ng/scripts/"
|
||||
with_items:
|
||||
- "{{ glob.files }}"
|
||||
|
||||
- name: copy sources
|
||||
command:
|
||||
cmd: "cp -r src/ {{ tempdir.path }}/paperless-ng/src"
|
||||
args:
|
||||
chdir: "{{ gitdir.path }}"
|
||||
|
||||
- name: create paperlessng venv
|
||||
command:
|
||||
cmd: "python3 -m virtualenv {{ gitdir.path }}/.venv/ -p /usr/bin/python3"
|
||||
|
||||
- name: install paperlessng requirements
|
||||
command:
|
||||
cmd: "{{ gitdir.path }}/.venv/bin/python3 -m pip install -r {{ gitdir.path }}/requirements.txt"
|
||||
|
||||
- name: compile messages
|
||||
command: "{{ gitdir.path }}/.venv/bin/python3 manage.py compilemessages"
|
||||
args:
|
||||
chdir: "{{ tempdir.path }}/paperless-ng/src/"
|
||||
|
||||
- name: collect static files
|
||||
command: "{{ gitdir.path }}/.venv/bin/python3 manage.py collectstatic --no-input"
|
||||
args:
|
||||
chdir: "{{ tempdir.path }}/paperless-ng/src/"
|
||||
|
||||
- name: remove pycache directories
|
||||
shell: find . -name __pycache__ | xargs rm -r
|
||||
args:
|
||||
chdir: "{{ tempdir.path }}"
|
||||
|
||||
- name: remove temporary git directory
|
||||
file:
|
||||
path: "{{ gitdir.path }}"
|
||||
state: absent
|
||||
|
||||
become: yes
|
||||
become_user: "{{ paperlessng_system_user }}"
|
@@ -1,553 +0,0 @@
|
||||
---
|
||||
- name: verify operating system
|
||||
fail:
|
||||
msg: Sorry, only Debian and Ubuntu supported at the moment.
|
||||
when: not(ansible_distribution == 'Debian' or ansible_distribution == 'Ubuntu')
|
||||
|
||||
- name: install base dependencies
|
||||
apt:
|
||||
update_cache: yes
|
||||
pkg:
|
||||
# paperless-ng
|
||||
- python3-pip
|
||||
- python3-dev
|
||||
- fonts-liberation
|
||||
- imagemagick
|
||||
- optipng
|
||||
- gnupg
|
||||
- libpq-dev
|
||||
- libmagic-dev
|
||||
- mime-support
|
||||
# OCRmyPDF
|
||||
- unpaper
|
||||
- ghostscript
|
||||
- icc-profiles-free
|
||||
- qpdf
|
||||
- liblept5
|
||||
- libxml2
|
||||
- pngquant
|
||||
- zlib1g
|
||||
- tesseract-ocr
|
||||
# dev
|
||||
- sudo
|
||||
- build-essential
|
||||
- python3-setuptools
|
||||
- python3-wheel
|
||||
|
||||
# upstream virtualenv in Ubuntu 20.04 is broken
|
||||
# https://github.com/pypa/virtualenv/issues/1873
|
||||
- name: install python virtualenv
|
||||
pip:
|
||||
name: virtualenv
|
||||
extra_args: --upgrade
|
||||
|
||||
- name: install ocr languages
|
||||
apt:
|
||||
pkg: "{{ paperlessng_ocr_languages | map('regex_replace', '^(.*)$', 'tesseract-ocr-\\1') | map('replace', '_', '-') | list }}"
|
||||
|
||||
- name: set up notesalexp repository key (for jbig2enc)
|
||||
apt_key:
|
||||
url: https://notesalexp.org/debian/alexp_key.asc
|
||||
state: present
|
||||
when: paperlessng_use_jbig2enc
|
||||
|
||||
- name: set up notesalexp repository (for jbig2enc)
|
||||
apt_repository:
|
||||
repo: "deb https://notesalexp.org/debian/{{ ansible_distribution_release }}/ {{ ansible_distribution_release }} main"
|
||||
state: present
|
||||
when: paperlessng_use_jbig2enc
|
||||
|
||||
- name: set up notesalexp repository pinning
|
||||
copy:
|
||||
content: |
|
||||
Package: *
|
||||
Pin: release o=notesalexp.org
|
||||
Pin-Priority: 1
|
||||
|
||||
Package: jbig2enc
|
||||
Pin: release o=notesalexp.org
|
||||
Pin-Priority: 500
|
||||
dest: /etc/apt/preferences.d/notesalexp
|
||||
when: paperlessng_use_jbig2enc
|
||||
|
||||
- name: install jbig2enc
|
||||
apt:
|
||||
pkg: jbig2enc
|
||||
update_cache: yes
|
||||
when: paperlessng_use_jbig2enc
|
||||
|
||||
- name: install redis
|
||||
apt:
|
||||
pkg: redis-server
|
||||
when: paperlessng_redis_host == 'localhost' or paperlessng_redis_host == '127.0.0.1'
|
||||
|
||||
- name: enable redis
|
||||
systemd:
|
||||
name: redis-server
|
||||
enabled: yes
|
||||
masked: no
|
||||
state: started
|
||||
when: paperlessng_redis_host == 'localhost' or paperlessng_redis_host == '127.0.0.1'
|
||||
|
||||
- name: create paperless system group
|
||||
group:
|
||||
name: "{{ paperlessng_system_group }}"
|
||||
|
||||
- name: create paperless system user
|
||||
user:
|
||||
name: "{{ paperlessng_system_user }}"
|
||||
groups:
|
||||
- "{{ paperlessng_system_group }}"
|
||||
shell: /usr/sbin/nologin
|
||||
# GNUPG_HOME required due to paperless db.py
|
||||
create_home: yes
|
||||
|
||||
- block:
|
||||
- name: get latest release version
|
||||
uri:
|
||||
url: https://api.github.com/repos/jonaswinkler/paperless-ng/releases/latest
|
||||
method: GET
|
||||
register: latest_release
|
||||
- name: parse latest release version
|
||||
set_fact:
|
||||
paperlessng_version: "{{ latest_release.json['tag_name'] }}"
|
||||
when: paperlessng_version == "latest"
|
||||
|
||||
- name: check if version is ref
|
||||
fail:
|
||||
msg: "Specifying `paperlessng_version` as git ref may not work as expected!"
|
||||
ignore_errors: True # Output failure (as warning), but don't consider play failed
|
||||
when: paperlessng_version.startswith('refs/')
|
||||
|
||||
- block:
|
||||
- name: sanitize version string
|
||||
set_fact:
|
||||
paperlessng_version: "{{ paperlessng_version | regex_replace('^ng-(\\d+\\.\\d+\\.\\d+)$', '\\1') }}"
|
||||
- name: get tag data
|
||||
uri:
|
||||
url: https://api.github.com/repos/jonaswinkler/paperless-ng/tags
|
||||
method: GET
|
||||
register: tags
|
||||
- name: get commit for target tag
|
||||
set_fact:
|
||||
paperlessng_commit: "{{ tags.json | json_query('[?name==`ng-' + paperlessng_version +'`] | [0].commit.sha') }}"
|
||||
when: paperlessng_version | regex_search("^(ng-)?(\d+\.\d+\.\d+)$")
|
||||
|
||||
- block:
|
||||
- name: check if version is branch
|
||||
uri:
|
||||
url: "https://api.github.com/repos/jonaswinkler/paperless-ng/branches/{{ paperlessng_version }}"
|
||||
method: GET
|
||||
status_code: [200, 404]
|
||||
register: branch
|
||||
- name: get commit for target branch
|
||||
set_fact:
|
||||
paperlessng_commit: "{{ branch.json | json_query('commit.sha') }}"
|
||||
when: branch.status == 200
|
||||
- block:
|
||||
- name: check if version is commit-or-ref
|
||||
uri:
|
||||
url: "https://api.github.com/repos/jonaswinkler/paperless-ng/commits/{{ paperlessng_version }}"
|
||||
method: GET
|
||||
status_code: [200, 404, 422]
|
||||
register: commit
|
||||
- name: get commit for target commit-or-ref
|
||||
set_fact:
|
||||
paperlessng_commit: "{{ commit.json | json_query('sha') }}"
|
||||
when: commit.status == 200
|
||||
- name: fail
|
||||
fail:
|
||||
msg: "Can not determine commit from `paperlessng_version=={{ paperlessng_version }}`!"
|
||||
when: commit.status != 200
|
||||
when: branch.status == 404
|
||||
when: not(paperlessng_version | regex_search("^(ng-)?(\d+\.\d+\.\d+)$"))
|
||||
|
||||
- name: check for paperless-ng installation
|
||||
command:
|
||||
cmd: "cat {{ paperlessng_directory }}/.installed_version"
|
||||
changed_when: '"No such file or directory" in paperlessng_current_commit.stderr or paperlessng_current_commit.stdout != paperlessng_commit | string'
|
||||
failed_when: false
|
||||
ignore_errors: yes
|
||||
register: paperlessng_current_commit
|
||||
|
||||
- name: register current state
|
||||
set_fact:
|
||||
fresh_installation: '{{ "No such file or directory" in paperlessng_current_commit.stderr }}'
|
||||
update_installation: '{{ "No such file or directory" not in paperlessng_current_commit.stderr and paperlessng_current_commit.stdout != paperlessng_commit | string }}'
|
||||
reconfigure_only: "{{ paperlessng_current_commit.stdout == paperlessng_commit | string }}"
|
||||
|
||||
- block:
|
||||
- name: backup current paperless-ng installation
|
||||
copy:
|
||||
src: "{{ paperlessng_directory }}"
|
||||
remote_src: yes
|
||||
dest: "{{ paperlessng_directory }}-{{ ansible_date_time.iso8601 }}/"
|
||||
- name: remove current paperless sources
|
||||
file:
|
||||
path: "{{ paperlessng_directory }}/{{ item }}"
|
||||
state: absent
|
||||
with_items:
|
||||
- docker
|
||||
- docs
|
||||
- scripts
|
||||
- src
|
||||
- static
|
||||
when: update_installation
|
||||
|
||||
- block:
|
||||
- name: create paperless-ng directory and set permissions
|
||||
file:
|
||||
path: "{{ paperlessng_directory }}"
|
||||
state: directory
|
||||
owner: "{{ paperlessng_system_user }}"
|
||||
group: "{{ paperlessng_system_group }}"
|
||||
mode: "750"
|
||||
- name: create temporary directory
|
||||
become: yes
|
||||
become_user: "{{ paperlessng_system_user }}"
|
||||
tempfile:
|
||||
state: directory
|
||||
path: "{{ paperlessng_directory }}"
|
||||
register: tempdir
|
||||
- name: check if version is available as release archive
|
||||
uri:
|
||||
url: "https://github.com/jonaswinkler/paperless-ng/releases/download/ng-{{ paperlessng_version }}/paperless-ng-{{ paperlessng_version }}.tar.xz"
|
||||
method: GET
|
||||
status_code: [200, 404]
|
||||
register: release_archive
|
||||
- name: install paperless-ng from source
|
||||
include_tasks: install-source.yml
|
||||
when: release_archive.status == 404
|
||||
- name: install paperless-ng from release archive
|
||||
include_tasks: install-release.yml
|
||||
when: release_archive.status == 200
|
||||
- name: change owner and permissions of paperless-ng
|
||||
command:
|
||||
cmd: "{{ item }}"
|
||||
warn: false
|
||||
with_items:
|
||||
- "chown -R {{ paperlessng_system_user }}:{{ paperlessng_system_group }} {{ tempdir.path }}"
|
||||
- "find {{ tempdir.path }} -type d -exec chmod 0750 {} ;"
|
||||
- "find {{ tempdir.path }} -type f -exec chmod 0640 {} ;"
|
||||
- name: move paperless-ng
|
||||
command:
|
||||
cmd: "cp -a {{ tempdir.path }}/paperless-ng/. {{ paperlessng_directory }}"
|
||||
- name: store commit hash of installed version
|
||||
copy:
|
||||
content: "{{ paperlessng_commit }}"
|
||||
dest: "{{ paperlessng_directory }}/.installed_version"
|
||||
owner: "{{ paperlessng_system_user }}"
|
||||
group: "{{ paperlessng_system_group }}"
|
||||
mode: "0440"
|
||||
- name: remove temporary directory
|
||||
file:
|
||||
path: "{{ tempdir.path }}"
|
||||
state: absent
|
||||
when: not reconfigure_only
|
||||
|
||||
- name: create paperless-ng directories and set permissions
|
||||
file:
|
||||
path: "{{ item }}"
|
||||
state: directory
|
||||
owner: "{{ paperlessng_system_user }}"
|
||||
group: "{{ paperlessng_system_group }}"
|
||||
mode: "750"
|
||||
with_items:
|
||||
- "{{ paperlessng_consumption_dir }}"
|
||||
- "{{ paperlessng_data_dir }}"
|
||||
- "{{ paperlessng_media_root }}"
|
||||
- "{{ paperlessng_staticdir }}"
|
||||
|
||||
- name: rename initial config
|
||||
command:
|
||||
cmd: "mv -f {{ paperlessng_directory }}/paperless.conf {{ paperlessng_directory }}/paperless.conf.template"
|
||||
removes: "{{ paperlessng_directory }}/paperless.conf"
|
||||
|
||||
- name: configure paperless-ng
|
||||
lineinfile:
|
||||
path: "{{ paperlessng_directory }}/paperless.conf.template"
|
||||
regexp: "^#?{{ item.regexp }}="
|
||||
line: "{{ item.line }}"
|
||||
with_items:
|
||||
# Required services
|
||||
- regexp: PAPERLESS_REDIS
|
||||
line: "PAPERLESS_REDIS=redis://{{ paperlessng_redis_host }}:{{ paperlessng_redis_port }}"
|
||||
# Paths and folders
|
||||
- regexp: PAPERLESS_CONSUMPTION_DIR
|
||||
line: "PAPERLESS_CONSUMPTION_DIR={{ paperlessng_consumption_dir }}"
|
||||
- regexp: PAPERLESS_DATA_DIR
|
||||
line: "PAPERLESS_DATA_DIR={{ paperlessng_data_dir }}"
|
||||
- regexp: PAPERLESS_MEDIA_ROOT
|
||||
line: "PAPERLESS_MEDIA_ROOT={{ paperlessng_media_root }}"
|
||||
- regexp: PAPERLESS_STATICDIR
|
||||
line: "PAPERLESS_STATICDIR={{ paperlessng_staticdir }}"
|
||||
- regexp: PAPERLESS_FILENAME_FORMAT
|
||||
line: "PAPERLESS_FILENAME_FORMAT={{ paperlessng_filename_format }}"
|
||||
- regexp: PAPERLESS_LOGGING_DIR
|
||||
line: "PAPERLESS_LOGGING_DIR={{ paperlessng_logging_dir }}"
|
||||
# Hosting & Security
|
||||
- regexp: PAPERLESS_SECRET_KEY
|
||||
line: "PAPERLESS_SECRET_KEY={{ paperlessng_secret_key }}"
|
||||
- regexp: PAPERLESS_ALLOWED_HOSTS
|
||||
line: "PAPERLESS_ALLOWED_HOSTS={{ paperlessng_allowed_hosts }}"
|
||||
- regexp: PAPERLESS_CORS_ALLOWED_HOSTS
|
||||
line: "PAPERLESS_CORS_ALLOWED_HOSTS={{ paperlessng_cors_allowed_hosts }}"
|
||||
- regexp: PAPERLESS_FORCE_SCRIPT_NAME
|
||||
line: "PAPERLESS_FORCE_SCRIPT_NAME={{ paperlessng_force_script_name }}"
|
||||
- regexp: PAPERLESS_STATIC_URL
|
||||
line: "PAPERLESS_STATIC_URL={{ paperlessng_static_url }}"
|
||||
- regexp: PAPERLESS_AUTO_LOGIN_USERNAME
|
||||
line: "PAPERLESS_AUTO_LOGIN_USERNAME={{ paperlessng_auto_login_username }}"
|
||||
- regexp: PAPERLESS_COOKIE_PREFIX
|
||||
line: "PAPERLESS_COOKIE_PREFIX={{ paperlessng_cookie_prefix }}"
|
||||
- regexp: PAPERLESS_ENABLE_HTTP_REMOTE_USER
|
||||
line: "PAPERLESS_ENABLE_HTTP_REMOTE_USER={{ paperlessng_enable_http_remote_user }}"
|
||||
# OCR settings
|
||||
- regexp: PAPERLESS_OCR_LANGUAGE
|
||||
line: "PAPERLESS_OCR_LANGUAGE={{ paperlessng_ocr_languages | join('+') | replace('-','_') }}"
|
||||
- regexp: PAPERLESS_OCR_MODE
|
||||
line: "PAPERLESS_OCR_MODE={{ paperlessng_ocr_mode }}"
|
||||
- regexp: PAPERLESS_OCR_CLEAN
|
||||
line: "PAPERLESS_OCR_CLEAN={{ paperlessng_ocr_clean }}"
|
||||
- regexp: PAPERLESS_OCR_DESKEW
|
||||
line: "PAPERLESS_OCR_DESKEW={{ paperlessng_ocr_deskew }}"
|
||||
- regexp: PAPERLESS_OCR_ROTATE_PAGES
|
||||
line: "PAPERLESS_OCR_ROTATE_PAGES={{ paperlessng_ocr_rotate_pages }}"
|
||||
- regexp: PAPERLESS_OCR_ROTATE_PAGES_THRESHOLD
|
||||
line: "PAPERLESS_OCR_ROTATE_PAGES_THRESHOLD={{ paperlessng_ocr_rotate_pages_threshold }}"
|
||||
- regexp: PAPERLESS_OCR_OUTPUT_TYPE
|
||||
line: "PAPERLESS_OCR_OUTPUT_TYPE={{ paperlessng_ocr_output_type }}"
|
||||
- regexp: PAPERLESS_OCR_PAGES
|
||||
line: "PAPERLESS_OCR_PAGES={{ paperlessng_ocr_pages }}"
|
||||
- regexp: PAPERLESS_OCR_IMAGE_DPI
|
||||
line: "PAPERLESS_OCR_IMAGE_DPI={{ paperlessng_ocr_image_dpi }}"
|
||||
- regexp: PAPERLESS_OCR_USER_ARGS
|
||||
line: "PAPERLESS_OCR_USER_ARGS={{ paperlessng_ocr_user_args | combine({'jbig2_lossy': true} if paperlessng_big2enc_lossy else {}) | to_json }}"
|
||||
# Tika settings
|
||||
- regexp: PAPERLESS_TIKA_ENABLED
|
||||
line: "PAPERLESS_TIKA_ENABLED={{ paperlessng_tika_enabled }}"
|
||||
- regexp: PAPERLESS_TIKA_ENDPOINT
|
||||
line: "PAPERLESS_TIKA_ENDPOINT={{ paperlessng_tika_endpoint }}"
|
||||
- regexp: PAPERLESS_TIKA_GOTENBERG_ENDPOINT
|
||||
line: "PAPERLESS_TIKA_GOTENBERG_ENDPOINT={{ paperlessng_tika_endpoint }}"
|
||||
# Software tweaks
|
||||
- regexp: PAPERLESS_TIME_ZONE
|
||||
line: "PAPERLESS_TIME_ZONE={{ paperlessng_time_zone }}"
|
||||
- regexp: PAPERLESS_CONSUMER_POLLING
|
||||
line: "PAPERLESS_CONSUMER_POLLING={{ paperlessng_consumer_polling }}"
|
||||
- regexp: PAPERLESS_CONSUMER_DELETE_DUPLICATES
|
||||
line: "PAPERLESS_CONSUMER_DELETE_DUPLICATES={{ paperlessng_consumer_delete_duplicates }}"
|
||||
- regexp: PAPERLESS_CONSUMER_RECURSIVE
|
||||
line: "PAPERLESS_CONSUMER_RECURSIVE={{ paperlessng_consumer_recursive }}"
|
||||
- regexp: PAPERLESS_CONSUMER_SUBDIRS_AS_TAGS
|
||||
line: "PAPERLESS_CONSUMER_SUBDIRS_AS_TAGS={{ paperlessng_consumer_subdirs_as_tags }}"
|
||||
- regexp: PAPERLESS_CONVERT_MEMORY_LIMIT
|
||||
line: "PAPERLESS_CONVERT_MEMORY_LIMIT={{ paperlessng_convert_memory_limit }}"
|
||||
- regexp: PAPERLESS_CONVERT_TMPDIR
|
||||
line: "PAPERLESS_CONVERT_TMPDIR={{ paperlessng_convert_tmpdir }}"
|
||||
- regexp: PAPERLESS_OPTIMIZE_THUMBNAILS
|
||||
line: "PAPERLESS_OPTIMIZE_THUMBNAILS={{ paperlessng_optimize_thumbnails }}"
|
||||
- regexp: PAPERLESS_POST_CONSUME_SCRIPT
|
||||
line: "PAPERLESS_POST_CONSUME_SCRIPT={{ paperlessng_post_consume_script }}"
|
||||
- regexp: PAPERLESS_FILENAME_DATE_ORDER
|
||||
line: "PAPERLESS_FILENAME_DATE_ORDER={{ paperlessng_filename_date_order }}"
|
||||
- regexp: PAPERLESS_THUMBNAIL_FONT_NAME
|
||||
line: "PAPERLESS_THUMBNAIL_FONT_NAME={{ paperlessng_thumbnail_font_name }}"
|
||||
- regexp: PAPERLESS_IGNORE_DATES
|
||||
line: "PAPERLESS_IGNORE_DATES={{ paperlessng_ignore_dates }}"
|
||||
no_log: yes
|
||||
|
||||
- name: configure paperless-ng database [sqlite]
|
||||
lineinfile:
|
||||
path: "{{ paperlessng_directory }}/paperless.conf.template"
|
||||
regexp: "^#?PAPERLESS_DBHOST=(.*)$"
|
||||
line: '#PAPERLESS_DBHOST=\1'
|
||||
backrefs: yes
|
||||
when: paperlessng_db_type == 'sqlite'
|
||||
|
||||
- name: configure paperless-ng database [postgresql]
|
||||
lineinfile:
|
||||
path: "{{ paperlessng_directory }}/paperless.conf.template"
|
||||
regexp: "^#?{{ item.regexp }}="
|
||||
line: "{{ item.line }}"
|
||||
with_items:
|
||||
- regexp: PAPERLESS_DBHOST
|
||||
line: "PAPERLESS_DBHOST={{ paperlessng_db_host }}"
|
||||
- regexp: PAPERLESS_DBPORT
|
||||
line: "PAPERLESS_DBPORT={{ paperlessng_db_port }}"
|
||||
- regexp: PAPERLESS_DBNAME
|
||||
line: "PAPERLESS_DBNAME={{ paperlessng_db_name }}"
|
||||
- regexp: PAPERLESS_DBUSER
|
||||
line: "PAPERLESS_DBUSER={{ paperlessng_db_user }}"
|
||||
- regexp: PAPERLESS_DBPASS
|
||||
line: "PAPERLESS_DBPASS={{ paperlessng_db_pass }}"
|
||||
- regexp: PAPERLESS_DBSSLMODE
|
||||
line: "PAPERLESS_DBSSLMODE={{ paperlessng_db_sslmode }}"
|
||||
when: paperlessng_db_type == 'postgresql'
|
||||
no_log: yes
|
||||
|
||||
- name: deploy paperless-ng configuration
|
||||
copy:
|
||||
src: "{{ paperlessng_directory }}/paperless.conf.template"
|
||||
remote_src: yes
|
||||
dest: /etc/paperless.conf
|
||||
owner: root
|
||||
group: root
|
||||
mode: "0644"
|
||||
register: configuration
|
||||
|
||||
- name: create paperlessng venv
|
||||
become: yes
|
||||
become_user: "{{ paperlessng_system_user }}"
|
||||
command:
|
||||
cmd: "python3 -m virtualenv {{ paperlessng_virtualenv }} -p /usr/bin/python3"
|
||||
creates: "{{ paperlessng_virtualenv }}"
|
||||
register: venv
|
||||
|
||||
- block:
|
||||
- name: install paperlessng requirements
|
||||
become: yes
|
||||
become_user: "{{ paperlessng_system_user }}"
|
||||
pip:
|
||||
requirements: "{{ paperlessng_directory }}/requirements.txt"
|
||||
executable: "{{ paperlessng_virtualenv }}/bin/pip3"
|
||||
extra_args: --upgrade
|
||||
- name: migrate database schema
|
||||
become: yes
|
||||
become_user: "{{ paperlessng_system_user }}"
|
||||
command: "{{ paperlessng_virtualenv }}/bin/python3 {{ paperlessng_directory }}/src/manage.py migrate"
|
||||
register: database_schema
|
||||
changed_when: '"No migrations to apply." not in database_schema.stdout'
|
||||
when: not reconfigure_only
|
||||
|
||||
- name: configure paperless superuser
|
||||
become: yes
|
||||
become_user: "{{ paperlessng_system_user }}"
|
||||
# "manage.py createsuperuser" only works on interactive TTYs
|
||||
vars:
|
||||
creation_script: |
|
||||
from django.contrib.auth.models import User
|
||||
from django.contrib.auth.hashers import get_hasher
|
||||
|
||||
if User.objects.filter(username='{{ paperlessng_superuser_name }}').exists():
|
||||
user = User.objects.get(username='{{ paperlessng_superuser_name }}')
|
||||
old = user.__dict__.copy()
|
||||
|
||||
user.is_superuser = True
|
||||
user.email = '{{ paperlessng_superuser_email }}'
|
||||
user.set_password('{{ paperlessng_superuser_password }}')
|
||||
user.save()
|
||||
new = user.__dict__
|
||||
|
||||
algorithm, iterations, old_salt, old_hash = old['password'].split('$')
|
||||
new_password_old_salt = get_hasher(algorithm).encode(password='{{ paperlessng_superuser_password }}', salt=old_salt, iterations=int(iterations))
|
||||
_, _, _, new_hash = new_password_old_salt.split('$')
|
||||
if not (old_hash == new_hash and old['is_superuser'] == new['is_superuser'] and old['email'] == new['email']):
|
||||
print('changed')
|
||||
else:
|
||||
User.objects.create_superuser('{{ paperlessng_superuser_name }}', '{{ paperlessng_superuser_email }}', '{{ paperlessng_superuser_password }}')
|
||||
print('changed')
|
||||
command: '{{ paperlessng_virtualenv }}/bin/python3 {{ paperlessng_directory }}/src/manage.py shell -c "{{ creation_script }}"'
|
||||
register: superuser
|
||||
changed_when: superuser.stdout == 'changed'
|
||||
no_log: yes
|
||||
|
||||
- name: set ownership and permissions on paperlessng venv
|
||||
file:
|
||||
path: "{{ paperlessng_virtualenv }}"
|
||||
state: directory
|
||||
recurse: yes
|
||||
owner: "{{ paperlessng_system_user }}"
|
||||
group: "{{ paperlessng_system_group }}"
|
||||
mode: g-w,o-rwx
|
||||
when: venv.changed or not reconfigure_only
|
||||
|
||||
- name: configure ghostscript for PDF
|
||||
lineinfile:
|
||||
path: /etc/ImageMagick-6/policy.xml
|
||||
regexp: '(\s+)<policy domain="coder" rights=".*" pattern="PDF" />'
|
||||
line: '\1<policy domain="coder" rights="read|write" pattern="PDF" />'
|
||||
backrefs: yes
|
||||
|
||||
- name: configure gunicorn web server
|
||||
lineinfile:
|
||||
path: "{{ paperlessng_directory }}/gunicorn.conf.py"
|
||||
regexp: '^bind = '
|
||||
line: "bind = '{{ paperlessng_listen_address }}:{{ paperlessng_listen_port }}'"
|
||||
|
||||
- name: configure systemd services
|
||||
ini_file:
|
||||
path: "{{ paperlessng_directory }}/scripts/{{ item[0] }}"
|
||||
section: "Service"
|
||||
option: "{{ item[1].option }}"
|
||||
value: "{{ item[1].value }}"
|
||||
with_nested:
|
||||
- [
|
||||
paperless-consumer.service,
|
||||
paperless-scheduler.service,
|
||||
paperless-webserver.service,
|
||||
]
|
||||
- [
|
||||
# https://www.freedesktop.org/software/systemd/man/systemd.exec.html
|
||||
{ option: "User", value: "{{ paperlessng_system_user }}" },
|
||||
{ option: "Group", value: "{{ paperlessng_system_group }}" },
|
||||
{ option: "WorkingDirectory", value: "{{ paperlessng_directory }}/src" },
|
||||
{ option: "ProtectSystem", value: "full" },
|
||||
{ option: "NoNewPrivileges", value: "true" },
|
||||
{ option: "PrivateUsers", value: "true" },
|
||||
{ option: "PrivateDevices", value: "true" },
|
||||
]
|
||||
|
||||
- name: configure paperless-consumer service
|
||||
ini_file:
|
||||
path: "{{ paperlessng_directory }}/scripts/paperless-consumer.service"
|
||||
section: "Service"
|
||||
option: "ExecStart"
|
||||
value: "{{ paperlessng_virtualenv }}/bin/python3 manage.py document_consumer"
|
||||
|
||||
- name: configure paperless-scheduler service
|
||||
ini_file:
|
||||
path: "{{ paperlessng_directory }}/scripts/paperless-scheduler.service"
|
||||
section: "Service"
|
||||
option: "ExecStart"
|
||||
value: "{{ paperlessng_virtualenv }}/bin/python3 manage.py qcluster"
|
||||
|
||||
- name: configure paperless-webserver service
|
||||
ini_file:
|
||||
path: "{{ paperlessng_directory }}/scripts/paperless-webserver.service"
|
||||
section: "Service"
|
||||
option: "ExecStart"
|
||||
value: "{{ paperlessng_virtualenv }}/bin/gunicorn -c {{ paperlessng_directory }}/gunicorn.conf.py paperless.asgi:application"
|
||||
|
||||
- name: copy systemd services
|
||||
copy:
|
||||
src: "{{ paperlessng_directory }}/scripts/{{ item }}"
|
||||
remote_src: yes
|
||||
dest: "/etc/systemd/system/{{ item }}"
|
||||
with_items:
|
||||
- paperless-consumer.service
|
||||
- paperless-scheduler.service
|
||||
- paperless-webserver.service
|
||||
register: paperless_services
|
||||
|
||||
- name: reload systemd daemon
|
||||
systemd:
|
||||
name: "{{ item }}"
|
||||
state: restarted
|
||||
daemon_reload: yes
|
||||
with_items:
|
||||
- paperless-consumer
|
||||
- paperless-scheduler
|
||||
- paperless-webserver
|
||||
when: paperless_services.changed or configuration.changed
|
||||
|
||||
- name: enable paperlessng services
|
||||
systemd:
|
||||
name: "{{ item }}"
|
||||
enabled: yes
|
||||
masked: no
|
||||
state: started
|
||||
with_items:
|
||||
- paperless-consumer
|
||||
- paperless-scheduler
|
||||
- paperless-webserver
|
43
build-docker-image.sh
Executable file
@@ -0,0 +1,43 @@
|
||||
#!/usr/bin/env bash
|
||||
|
||||
# Helper script for building the Docker image locally.
|
||||
# Parses and provides the necessary versions of other images to Docker
|
||||
# before passing in the rest of script args.
|
||||
|
||||
# First Argument: The Dockerfile to build
|
||||
# Other Arguments: Additional arguments to docker build
|
||||
|
||||
# Example Usage:
|
||||
# ./build-docker-image.sh Dockerfile -t paperless-ngx:my-awesome-feature
|
||||
|
||||
set -eux
|
||||
|
||||
if ! command -v jq; then
|
||||
echo "jq required"
|
||||
exit 1
|
||||
elif [ ! -f "$1" ]; then
|
||||
echo "$1 is not a file, please provide the Dockerfile"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Parse what we can from Pipfile.lock
|
||||
pikepdf_version=$(jq ".default.pikepdf.version" Pipfile.lock | sed 's/=//g' | sed 's/"//g')
|
||||
psycopg2_version=$(jq ".default.psycopg2.version" Pipfile.lock | sed 's/=//g' | sed 's/"//g')
|
||||
# Read this from the other config file
|
||||
qpdf_version=$(jq ".qpdf.version" .build-config.json | sed 's/"//g')
|
||||
jbig2enc_version=$(jq ".jbig2enc.version" .build-config.json | sed 's/"//g')
|
||||
# Get the branch name (used for caching)
|
||||
branch_name=$(git rev-parse --abbrev-ref HEAD)
|
||||
|
||||
# https://docs.docker.com/develop/develop-images/build_enhancements/
|
||||
# Required to use cache-from
|
||||
export DOCKER_BUILDKIT=1
|
||||
|
||||
docker build --file "$1" \
|
||||
--progress=plain \
|
||||
--cache-from ghcr.io/paperless-ngx/paperless-ngx/builder/cache/app:"${branch_name}" \
|
||||
--cache-from ghcr.io/paperless-ngx/paperless-ngx/builder/cache/app:dev \
|
||||
--build-arg JBIG2ENC_VERSION="${jbig2enc_version}" \
|
||||
--build-arg QPDF_VERSION="${qpdf_version}" \
|
||||
--build-arg PIKEPDF_VERSION="${pikepdf_version}" \
|
||||
--build-arg PSYCOPG2_VERSION="${psycopg2_version}" "${@:2}" .
|
@@ -1,7 +0,0 @@
|
||||
#!/bin/bash
|
||||
|
||||
set -e
|
||||
|
||||
cd src-ui
|
||||
npm install
|
||||
./node_modules/.bin/ng build --prod
|
14
docker-builders/Dockerfile.frontend
Normal file
@@ -0,0 +1,14 @@
|
||||
# This Dockerfile compiles the frontend
|
||||
# Inputs: None
|
||||
|
||||
FROM node:16-bullseye-slim AS compile-frontend
|
||||
|
||||
COPY ./src /src/src
|
||||
COPY ./src-ui /src/src-ui
|
||||
|
||||
WORKDIR /src/src-ui
|
||||
RUN set -eux \
|
||||
&& npm update npm -g \
|
||||
&& npm ci --omit=optional
|
||||
RUN set -eux \
|
||||
&& ./node_modules/.bin/ng build --configuration production
|
35
docker-builders/Dockerfile.jbig2enc
Normal file
@@ -0,0 +1,35 @@
|
||||
# This Dockerfile compiles the jbig2enc library
|
||||
# Inputs:
|
||||
# - JBIG2ENC_VERSION - the Git tag to checkout and build
|
||||
|
||||
FROM debian:bullseye-slim as main
|
||||
|
||||
LABEL org.opencontainers.image.description="A intermediate image with jbig2enc built"
|
||||
|
||||
ARG DEBIAN_FRONTEND=noninteractive
|
||||
ARG JBIG2ENC_VERSION
|
||||
|
||||
ARG BUILD_PACKAGES="\
|
||||
build-essential \
|
||||
automake \
|
||||
libtool \
|
||||
libleptonica-dev \
|
||||
zlib1g-dev \
|
||||
git \
|
||||
ca-certificates"
|
||||
|
||||
WORKDIR /usr/src/jbig2enc
|
||||
|
||||
RUN set -eux \
|
||||
&& echo "Installing build tools" \
|
||||
&& apt-get update --quiet \
|
||||
&& apt-get install --yes --quiet --no-install-recommends ${BUILD_PACKAGES} \
|
||||
&& echo "Building jbig2enc" \
|
||||
&& git clone --quiet --branch $JBIG2ENC_VERSION https://github.com/agl/jbig2enc . \
|
||||
&& ./autogen.sh \
|
||||
&& ./configure \
|
||||
&& make \
|
||||
&& echo "Cleaning up image" \
|
||||
&& apt-get -y purge ${BUILD_PACKAGES} \
|
||||
&& apt-get -y autoremove --purge \
|
||||
&& rm -rf /var/lib/apt/lists/*
|
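For local experimentation, this image could be built by passing the Git tag as a build argument, mirroring what the helper script reads from .build-config.json; the tag and version below are examples, not the pinned values:

# 0.29 is an illustrative jbig2enc tag; the pinned value lives in .build-config.json
docker build \
    --file docker-builders/Dockerfile.jbig2enc \
    --build-arg JBIG2ENC_VERSION=0.29 \
    --tag paperless-jbig2enc:0.29 .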
88
docker-builders/Dockerfile.pikepdf
Normal file
@@ -0,0 +1,88 @@
|
||||
# This Dockerfile builds the pikepdf wheel
|
||||
# Inputs:
|
||||
# - REPO - Docker repository to pull qpdf from
|
||||
# - QPDF_VERSION - The qpdf image version to copy the .deb files from
|
||||
# - PIKEPDF_VERSION - Version of pikepdf to build wheel for
|
||||
|
||||
# Default to pulling from the main repo registry when manually building
|
||||
ARG REPO="paperless-ngx/paperless-ngx"
|
||||
|
||||
ARG QPDF_VERSION
|
||||
FROM ghcr.io/${REPO}/builder/qpdf:${QPDF_VERSION} as qpdf-builder
|
||||
|
||||
# This does nothing, except provide a name for a copy below
|
||||
|
||||
FROM python:3.9-slim-bullseye as main
|
||||
|
||||
LABEL org.opencontainers.image.description="An intermediate image with the pikepdf wheel built"
|
||||
|
||||
ARG DEBIAN_FRONTEND=noninteractive
|
||||
ARG PIKEPDF_VERSION
|
||||
|
||||
ARG BUILD_PACKAGES="\
|
||||
build-essential \
|
||||
python3-dev \
|
||||
python3-pip \
|
||||
# qpdf requirement - https://github.com/qpdf/qpdf#crypto-providers
|
||||
libgnutls28-dev \
|
||||
# lxml requirements - https://lxml.de/installation.html
|
||||
libxml2-dev \
|
||||
libxslt1-dev \
|
||||
# Pillow requirements - https://pillow.readthedocs.io/en/stable/installation.html#external-libraries
|
||||
# JPEG functionality
|
||||
libjpeg62-turbo-dev \
|
||||
# compressed PNG
|
||||
zlib1g-dev \
|
||||
# compressed TIFF
|
||||
libtiff-dev \
|
||||
# type related services
|
||||
libfreetype-dev \
|
||||
# color management
|
||||
liblcms2-dev \
|
||||
# WebP format
|
||||
libwebp-dev \
|
||||
# JPEG 2000
|
||||
libopenjp2-7-dev \
|
||||
# improved color quantization
|
||||
libimagequant-dev \
|
||||
# complex text layout support
|
||||
libraqm-dev"
|
||||
|
||||
WORKDIR /usr/src
|
||||
|
||||
COPY --from=qpdf-builder /usr/src/qpdf/*.deb ./
|
||||
|
||||
# As this is a base image for a multi-stage final image
|
||||
# the added size of the install is basically irrelevant
|
||||
|
||||
RUN set -eux \
|
||||
&& echo "Installing build tools" \
|
||||
&& apt-get update --quiet \
|
||||
&& apt-get install --yes --quiet --no-install-recommends ${BUILD_PACKAGES} \
|
||||
&& echo "Installing qpdf" \
|
||||
&& dpkg --install libqpdf28_*.deb \
|
||||
&& dpkg --install libqpdf-dev_*.deb \
|
||||
&& echo "Installing Python tools" \
|
||||
&& python3 -m pip install --no-cache-dir --upgrade \
|
||||
pip \
|
||||
wheel \
|
||||
# https://pikepdf.readthedocs.io/en/latest/installation.html#requirements
|
||||
pybind11 \
|
||||
&& echo "Building pikepdf wheel ${PIKEPDF_VERSION}" \
|
||||
&& mkdir wheels \
|
||||
&& python3 -m pip wheel \
|
||||
# Build the package at the required version
|
||||
pikepdf==${PIKEPDF_VERSION} \
|
||||
# Output the *.whl into this directory
|
||||
--wheel-dir wheels \
|
||||
# Do not use a binary package for the package being built
|
||||
--no-binary=pikepdf \
|
||||
# Do use binary packages for dependencies
|
||||
--prefer-binary \
|
||||
# Don't cache build files
|
||||
--no-cache-dir \
|
||||
&& ls -ahl wheels \
|
||||
&& echo "Cleaning up image" \
|
||||
&& apt-get -y purge ${BUILD_PACKAGES} \
|
||||
&& apt-get -y autoremove --purge \
|
||||
&& rm -rf /var/lib/apt/lists/*
|
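Because the wheel ends up under /usr/src/wheels inside the image, a later stage (or a curious developer) can lift it back out; a sketch, with the tag, version numbers and registry values being assumptions rather than the pinned configuration:

docker build \
    --file docker-builders/Dockerfile.pikepdf \
    --build-arg REPO=paperless-ngx/paperless-ngx \
    --build-arg QPDF_VERSION=10.6.3 \
    --build-arg PIKEPDF_VERSION=5.1.1 \
    --tag paperless-pikepdf:local .
# Copy the built wheel out of the image for inspection
id=$(docker create paperless-pikepdf:local)
docker cp "$id":/usr/src/wheels ./wheels
docker rm "$id"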
48
docker-builders/Dockerfile.psycopg2
Normal file
@@ -0,0 +1,48 @@
|
||||
# This Dockerfile builds the psycopg2 wheel
|
||||
# Inputs:
|
||||
# - PSYCOPG2_VERSION - Version to build
|
||||
|
||||
FROM python:3.9-slim-bullseye as main
|
||||
|
||||
LABEL org.opencontainers.image.description="An intermediate image with the psycopg2 wheel built"
|
||||
|
||||
ARG PSYCOPG2_VERSION
|
||||
ARG DEBIAN_FRONTEND=noninteractive
|
||||
|
||||
ARG BUILD_PACKAGES="\
|
||||
build-essential \
|
||||
python3-dev \
|
||||
python3-pip \
|
||||
# https://www.psycopg.org/docs/install.html#prerequisites
|
||||
libpq-dev"
|
||||
|
||||
WORKDIR /usr/src
|
||||
|
||||
# As this is a base image for a multi-stage final image
|
||||
# the added size of the install is basically irrelevant
|
||||
|
||||
RUN set -eux \
|
||||
&& echo "Installing build tools" \
|
||||
&& apt-get update --quiet \
|
||||
&& apt-get install --yes --quiet --no-install-recommends ${BUILD_PACKAGES} \
|
||||
&& echo "Installing Python tools" \
|
||||
&& python3 -m pip install --no-cache-dir --upgrade pip wheel \
|
||||
&& echo "Building psycopg2 wheel ${PSYCOPG2_VERSION}" \
|
||||
&& cd /usr/src \
|
||||
&& mkdir wheels \
|
||||
&& python3 -m pip wheel \
|
||||
# Build the package at the required version
|
||||
psycopg2==${PSYCOPG2_VERSION} \
|
||||
# Output the *.whl into this directory
|
||||
--wheel-dir wheels \
|
||||
# Do not use a binary package for the package being built
|
||||
--no-binary=psycopg2 \
|
||||
# Do use binary packages for dependencies
|
||||
--prefer-binary \
|
||||
# Don't cache build files
|
||||
--no-cache-dir \
|
||||
&& ls -ahl wheels/ \
|
||||
&& echo "Cleaning up image" \
|
||||
&& apt-get -y purge ${BUILD_PACKAGES} \
|
||||
&& apt-get -y autoremove --purge \
|
||||
&& rm -rf /var/lib/apt/lists/*
|
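Once extracted the same way as the pikepdf wheel, the result can be sanity-checked in a throwaway environment; a minimal sketch, with the wheel file name pattern assumed:

python3 -m venv /tmp/psycopg2-check
/tmp/psycopg2-check/bin/pip install ./wheels/psycopg2-*.whl
# Note: importing psycopg2 still needs the libpq runtime library installed on the host
/tmp/psycopg2-check/bin/python -c "import psycopg2; print(psycopg2.__version__)"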
52
docker-builders/Dockerfile.qpdf
Normal file
@@ -0,0 +1,52 @@
|
||||
# This Dockerfile builds qpdf as Debian packages
|
||||
# Inputs:
|
||||
# - QPDF_VERSION - the version of qpdf to build a .deb.
|
||||
# Must be present as a deb-src
|
||||
|
||||
FROM debian:bullseye-slim as main
|
||||
|
||||
LABEL org.opencontainers.image.description="An intermediate image with qpdf built"
|
||||
|
||||
ARG DEBIAN_FRONTEND=noninteractive
|
||||
# This must be at least pikepdf's minimum supported qpdf version
|
||||
ARG QPDF_VERSION
|
||||
|
||||
ARG BUILD_PACKAGES="\
|
||||
build-essential \
|
||||
debhelper \
|
||||
debian-keyring \
|
||||
devscripts \
|
||||
equivs \
|
||||
libtool \
|
||||
# https://qpdf.readthedocs.io/en/stable/installation.html#system-requirements
|
||||
libjpeg62-turbo-dev \
|
||||
libgnutls28-dev \
|
||||
packaging-dev \
|
||||
zlib1g-dev"
|
||||
|
||||
WORKDIR /usr/src
|
||||
|
||||
# As this is a base image for a multi-stage final image
|
||||
# the added size of the install is basically irrelevant
|
||||
|
||||
RUN set -eux \
|
||||
&& echo "Installing build tools" \
|
||||
&& apt-get update --quiet \
|
||||
&& apt-get install --yes --quiet --no-install-recommends $BUILD_PACKAGES \
|
||||
&& echo "Building qpdf" \
|
||||
&& echo "deb-src http://deb.debian.org/debian/ bookworm main" > /etc/apt/sources.list.d/bookworm-src.list \
|
||||
&& apt-get update \
|
||||
&& mkdir qpdf \
|
||||
&& cd qpdf \
|
||||
&& apt-get source --yes --quiet qpdf=${QPDF_VERSION}-1/bookworm \
|
||||
&& cd qpdf-$QPDF_VERSION \
|
||||
# We don't need to build the tests (also don't run them)
|
||||
&& rm -rf libtests \
|
||||
&& DEBEMAIL=hello@paperless-ngx.com debchange --bpo \
|
||||
&& export DEB_BUILD_OPTIONS="terse nocheck nodoc parallel=2" \
|
||||
&& dpkg-buildpackage --build=binary --unsigned-source --unsigned-changes --post-clean \
|
||||
&& ls -ahl ../*.deb \
|
||||
&& echo "Cleaning up image" \
|
||||
&& apt-get -y purge ${BUILD_PACKAGES} \
|
||||
&& apt-get -y autoremove --purge \
|
||||
&& rm -rf /var/lib/apt/lists/*
|
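The stage leaves the rebuilt packages one directory above the source tree (hence the ls -ahl ../*.deb); if copied out of the image they can be inspected with standard dpkg tooling, for example (the file names below are illustrative backport names, not guaranteed outputs):

dpkg-deb --info libqpdf28_10.6.3-1~bpo11+1_amd64.deb        # package metadata of the backported build
dpkg-deb --contents libqpdf-dev_10.6.3-1~bpo11+1_amd64.deb  # files the dev package would install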
@@ -22,6 +22,10 @@
|
||||
# Docker setup does not use the configuration file.
|
||||
# A few commonly adjusted settings are provided below.
|
||||
|
||||
# This is required if you will be exposing Paperless-ngx on a public domain
|
||||
# (if doing so, please consider security measures such as a reverse proxy)
|
||||
#PAPERLESS_URL=https://paperless.example.com
|
||||
|
||||
# Adjust this key if you plan to make paperless available publicly. It should
|
||||
# be a very long sequence of random characters. You don't need to remember it.
|
||||
#PAPERLESS_SECRET_KEY=change-me
|
||||
@@ -32,3 +36,7 @@
|
||||
# The default language to use for OCR. Set this to the language most of your
|
||||
# documents are written in.
|
||||
#PAPERLESS_OCR_LANGUAGE=eng
|
||||
|
||||
# Set if accessing paperless via a domain subpath e.g. https://domain.com/PATHPREFIX and using a reverse-proxy like traefik or nginx
|
||||
#PAPERLESS_FORCE_SCRIPT_NAME=/PATHPREFIX
|
||||
#PAPERLESS_STATIC_URL=/PATHPREFIX/static/ # trailing slash required
|
||||
|
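One common way to generate a suitably long random value for PAPERLESS_SECRET_KEY; the exact command is only a suggestion, any long random string works:

# Prints roughly 64 characters of base64, suitable for pasting after PAPERLESS_SECRET_KEY=
openssl rand -base64 48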
101
docker/compose/docker-compose.mariadb-tika.yml
Normal file
@@ -0,0 +1,101 @@
|
||||
# docker-compose file for running paperless from the docker container registry.
|
||||
# This file contains everything paperless needs to run.
|
||||
# Paperless supports amd64, arm and arm64 hardware.
|
||||
#
|
||||
# All compose files of paperless configure paperless in the following way:
|
||||
#
|
||||
# - Paperless is (re)started on system boot, if it was running before shutdown.
|
||||
# - Docker volumes for storing data are managed by Docker.
|
||||
# - Folders for importing and exporting files are created in the same directory
|
||||
# as this file and mounted to the correct folders inside the container.
|
||||
# - Paperless listens on port 8000.
|
||||
#
|
||||
# In addition to that, this docker-compose file adds the following optional
|
||||
# configurations:
|
||||
#
|
||||
# - Instead of SQLite (default), MariaDB is used as the database server.
|
||||
# - Apache Tika and Gotenberg servers are started with paperless and paperless
|
||||
# is configured to use these services. These provide support for consuming
|
||||
# Office documents (Word, Excel, PowerPoint and their LibreOffice counter-
|
||||
# parts).
|
||||
#
|
||||
# To install and update paperless with this file, do the following:
|
||||
#
|
||||
# - Copy this file as 'docker-compose.yml' and the files 'docker-compose.env'
|
||||
# and '.env' into a folder.
|
||||
# - Run 'docker-compose pull'.
|
||||
# - Run 'docker-compose run --rm webserver createsuperuser' to create a user.
|
||||
# - Run 'docker-compose up -d'.
|
||||
#
|
||||
# For more extensive installation and update instructions, refer to the
|
||||
# documentation.
|
||||
|
||||
version: "3.4"
|
||||
services:
|
||||
broker:
|
||||
image: docker.io/library/redis:7
|
||||
restart: unless-stopped
|
||||
volumes:
|
||||
- redisdata:/data
|
||||
|
||||
db:
|
||||
image: docker.io/library/mariadb:10
|
||||
restart: unless-stopped
|
||||
volumes:
|
||||
- dbdata:/var/lib/mysql
|
||||
environment:
|
||||
MARIADB_HOST: paperless
|
||||
MARIADB_DATABASE: paperless
|
||||
MARIADB_USER: paperless
|
||||
MARIADB_PASSWORD: paperless
|
||||
MARIADB_ROOT_PASSWORD: paperless
|
||||
ports:
|
||||
- "3306:3306"
|
||||
|
||||
webserver:
|
||||
image: ghcr.io/paperless-ngx/paperless-ngx:latest
|
||||
restart: unless-stopped
|
||||
depends_on:
|
||||
- db
|
||||
- broker
|
||||
- gotenberg
|
||||
- tika
|
||||
ports:
|
||||
- 8000:8000
|
||||
healthcheck:
|
||||
test: ["CMD", "curl", "-f", "http://localhost:8000"]
|
||||
interval: 30s
|
||||
timeout: 10s
|
||||
retries: 5
|
||||
volumes:
|
||||
- data:/usr/src/paperless/data
|
||||
- media:/usr/src/paperless/media
|
||||
- ./export:/usr/src/paperless/export
|
||||
- ./consume:/usr/src/paperless/consume
|
||||
env_file: docker-compose.env
|
||||
environment:
|
||||
PAPERLESS_REDIS: redis://broker:6379
|
||||
PAPERLESS_DBENGINE: mariadb
|
||||
PAPERLESS_DBHOST: db
|
||||
PAPERLESS_DBUSER: paperless
|
||||
PAPERLESS_DBPASSWORD: paperless
|
||||
PAPERLESS_DBPORT: 3306
|
||||
PAPERLESS_TIKA_ENABLED: 1
|
||||
PAPERLESS_TIKA_GOTENBERG_ENDPOINT: http://gotenberg:3000
|
||||
PAPERLESS_TIKA_ENDPOINT: http://tika:9998
|
||||
|
||||
gotenberg:
|
||||
image: docker.io/gotenberg/gotenberg:7.4
|
||||
restart: unless-stopped
|
||||
environment:
|
||||
CHROMIUM_DISABLE_ROUTES: 1
|
||||
|
||||
tika:
|
||||
image: ghcr.io/paperless-ngx/tika:latest
|
||||
restart: unless-stopped
|
||||
|
||||
volumes:
|
||||
data:
|
||||
media:
|
||||
dbdata:
|
||||
redisdata:
|
83
docker/compose/docker-compose.mariadb.yml
Normal file
@@ -0,0 +1,83 @@
|
||||
# docker-compose file for running paperless from the docker container registry.
|
||||
# This file contains everything paperless needs to run.
|
||||
# Paperless supports amd64, arm and arm64 hardware.
|
||||
#
|
||||
# All compose files of paperless configure paperless in the following way:
|
||||
#
|
||||
# - Paperless is (re)started on system boot, if it was running before shutdown.
|
||||
# - Docker volumes for storing data are managed by Docker.
|
||||
# - Folders for importing and exporting files are created in the same directory
|
||||
# as this file and mounted to the correct folders inside the container.
|
||||
# - Paperless listens on port 8000.
|
||||
#
|
||||
# In addition to that, this docker-compose file adds the following optional
|
||||
# configurations:
|
||||
#
|
||||
# - Instead of SQLite (default), MariaDB is used as the database server.
|
||||
#
|
||||
# To install and update paperless with this file, do the following:
|
||||
#
|
||||
# - Copy this file as 'docker-compose.yml' and the files 'docker-compose.env'
|
||||
# and '.env' into a folder.
|
||||
# - Run 'docker-compose pull'.
|
||||
# - Run 'docker-compose run --rm webserver createsuperuser' to create a user.
|
||||
# - Run 'docker-compose up -d'.
|
||||
#
|
||||
# For more extensive installation and update instructions, refer to the
|
||||
# documentation.
|
||||
|
||||
version: "3.4"
|
||||
services:
|
||||
broker:
|
||||
image: docker.io/library/redis:7
|
||||
restart: unless-stopped
|
||||
volumes:
|
||||
- redisdata:/data
|
||||
|
||||
db:
|
||||
image: docker.io/library/mariadb:10
|
||||
restart: unless-stopped
|
||||
volumes:
|
||||
- dbdata:/var/lib/mysql
|
||||
environment:
|
||||
MARIADB_HOST: paperless
|
||||
MARIADB_DATABASE: paperless
|
||||
MARIADB_USER: paperless
|
||||
MARIADB_PASSWORD: paperless
|
||||
MARIADB_ROOT_PASSWORD: paperless
|
||||
ports:
|
||||
- "3306:3306"
|
||||
|
||||
webserver:
|
||||
image: ghcr.io/paperless-ngx/paperless-ngx:latest
|
||||
restart: unless-stopped
|
||||
depends_on:
|
||||
- db
|
||||
- broker
|
||||
ports:
|
||||
- 8000:8000
|
||||
healthcheck:
|
||||
test: ["CMD", "curl", "-f", "http://localhost:8000"]
|
||||
interval: 30s
|
||||
timeout: 10s
|
||||
retries: 5
|
||||
volumes:
|
||||
- data:/usr/src/paperless/data
|
||||
- media:/usr/src/paperless/media
|
||||
- ./export:/usr/src/paperless/export
|
||||
- ./consume:/usr/src/paperless/consume
|
||||
env_file: docker-compose.env
|
||||
environment:
|
||||
PAPERLESS_REDIS: redis://broker:6379
|
||||
PAPERLESS_DBENGINE: mariadb
|
||||
PAPERLESS_DBHOST: db
|
||||
PAPERLESS_DBUSER: paperless
|
||||
PAPERLESS_DBPASSWORD: paperless
|
||||
PAPERLESS_DBPORT: 3306
|
||||
|
||||
|
||||
volumes:
|
||||
data:
|
||||
media:
|
||||
dbdata:
|
||||
redisdata:
|
@@ -31,11 +31,13 @@
|
||||
version: "3.4"
|
||||
services:
|
||||
broker:
|
||||
image: redis:6.0
|
||||
image: docker.io/library/redis:7
|
||||
restart: unless-stopped
|
||||
volumes:
|
||||
- redisdata:/data
|
||||
|
||||
db:
|
||||
image: postgres:13
|
||||
image: docker.io/library/postgres:13
|
||||
restart: unless-stopped
|
||||
volumes:
|
||||
- pgdata:/var/lib/postgresql/data
|
||||
@@ -45,7 +47,7 @@ services:
|
||||
POSTGRES_PASSWORD: paperless
|
||||
|
||||
webserver:
|
||||
image: jonaswinkler/paperless-ng:latest
|
||||
image: ghcr.io/paperless-ngx/paperless-ngx:latest
|
||||
restart: unless-stopped
|
||||
depends_on:
|
||||
- db
|
||||
@@ -53,7 +55,7 @@ services:
|
||||
ports:
|
||||
- 8010:8000
|
||||
healthcheck:
|
||||
test: ["CMD", "curl", "-f", "http://localhost:8000"]
|
||||
test: ["CMD", "curl", "-fs", "-S", "--max-time", "2", "http://localhost:8000"]
|
||||
interval: 30s
|
||||
timeout: 10s
|
||||
retries: 5
|
||||
@@ -92,3 +94,4 @@ volumes:
|
||||
data:
|
||||
media:
|
||||
pgdata:
|
||||
redisdata:
|
||||
|
@@ -1,4 +1,4 @@
|
||||
# docker-compose file for running paperless from the Docker Hub.
|
||||
# docker-compose file for running paperless from the docker container registry.
|
||||
# This file contains everything paperless needs to run.
|
||||
# Paperless supports amd64, arm and arm64 hardware.
|
||||
#
|
||||
@@ -33,11 +33,13 @@
|
||||
version: "3.4"
|
||||
services:
|
||||
broker:
|
||||
image: redis:6.0
|
||||
image: docker.io/library/redis:7
|
||||
restart: unless-stopped
|
||||
volumes:
|
||||
- redisdata:/data
|
||||
|
||||
db:
|
||||
image: postgres:13
|
||||
image: docker.io/library/postgres:13
|
||||
restart: unless-stopped
|
||||
volumes:
|
||||
- pgdata:/var/lib/postgresql/data
|
||||
@@ -47,7 +49,7 @@ services:
|
||||
POSTGRES_PASSWORD: paperless
|
||||
|
||||
webserver:
|
||||
image: jonaswinkler/paperless-ng:latest
|
||||
image: ghcr.io/paperless-ngx/paperless-ngx:latest
|
||||
restart: unless-stopped
|
||||
depends_on:
|
||||
- db
|
||||
@@ -57,7 +59,7 @@ services:
|
||||
ports:
|
||||
- 8000:8000
|
||||
healthcheck:
|
||||
test: ["CMD", "curl", "-f", "http://localhost:8000"]
|
||||
test: ["CMD", "curl", "-fs", "-S", "--max-time", "2", "http://localhost:8000"]
|
||||
interval: 30s
|
||||
timeout: 10s
|
||||
retries: 5
|
||||
@@ -75,16 +77,18 @@ services:
|
||||
PAPERLESS_TIKA_ENDPOINT: http://tika:9998
|
||||
|
||||
gotenberg:
|
||||
image: thecodingmachine/gotenberg
|
||||
image: docker.io/gotenberg/gotenberg:7.4
|
||||
restart: unless-stopped
|
||||
environment:
|
||||
DISABLE_GOOGLE_CHROME: 1
|
||||
command:
|
||||
- "gotenberg"
|
||||
- "--chromium-disable-routes=true"
|
||||
|
||||
tika:
|
||||
image: apache/tika
|
||||
image: ghcr.io/paperless-ngx/tika:latest
|
||||
restart: unless-stopped
|
||||
|
||||
volumes:
|
||||
data:
|
||||
media:
|
||||
pgdata:
|
||||
redisdata:
|
||||
|
@@ -29,11 +29,13 @@
|
||||
version: "3.4"
|
||||
services:
|
||||
broker:
|
||||
image: redis:6.0
|
||||
image: docker.io/library/redis:7
|
||||
restart: unless-stopped
|
||||
volumes:
|
||||
- redisdata:/data
|
||||
|
||||
db:
|
||||
image: postgres:13
|
||||
image: docker.io/library/postgres:13
|
||||
restart: unless-stopped
|
||||
volumes:
|
||||
- pgdata:/var/lib/postgresql/data
|
||||
@@ -43,7 +45,7 @@ services:
|
||||
POSTGRES_PASSWORD: paperless
|
||||
|
||||
webserver:
|
||||
image: jonaswinkler/paperless-ng:latest
|
||||
image: ghcr.io/paperless-ngx/paperless-ngx:latest
|
||||
restart: unless-stopped
|
||||
depends_on:
|
||||
- db
|
||||
@@ -51,7 +53,7 @@ services:
|
||||
ports:
|
||||
- 8000:8000
|
||||
healthcheck:
|
||||
test: ["CMD", "curl", "-f", "http://localhost:8000"]
|
||||
test: ["CMD", "curl", "-fs", "-S", "--max-time", "2", "http://localhost:8000"]
|
||||
interval: 30s
|
||||
timeout: 10s
|
||||
retries: 5
|
||||
@@ -70,3 +72,4 @@ volumes:
|
||||
data:
|
||||
media:
|
||||
pgdata:
|
||||
redisdata:
|
||||
|
@@ -1,7 +1,6 @@
|
||||
# docker-compose file for running paperless from the Docker Hub.
|
||||
# docker-compose file for running paperless from the docker container registry.
|
||||
# This file contains everything paperless needs to run.
|
||||
# Paperless supports amd64, arm and arm64 hardware.
|
||||
#
|
||||
# All compose files of paperless configure paperless in the following way:
|
||||
#
|
||||
# - Paperless is (re)started on system boot, if it was running before shutdown.
|
||||
@@ -34,11 +33,13 @@
|
||||
version: "3.4"
|
||||
services:
|
||||
broker:
|
||||
image: redis:6.0
|
||||
image: docker.io/library/redis:7
|
||||
restart: unless-stopped
|
||||
volumes:
|
||||
- redisdata:/data
|
||||
|
||||
webserver:
|
||||
image: jonaswinkler/paperless-ng:latest
|
||||
image: ghcr.io/paperless-ngx/paperless-ngx:latest
|
||||
restart: unless-stopped
|
||||
depends_on:
|
||||
- broker
|
||||
@@ -47,7 +48,7 @@ services:
|
||||
ports:
|
||||
- 8000:8000
|
||||
healthcheck:
|
||||
test: ["CMD", "curl", "-f", "http://localhost:8000"]
|
||||
test: ["CMD", "curl", "-fs", "-S", "--max-time", "2", "http://localhost:8000"]
|
||||
interval: 30s
|
||||
timeout: 10s
|
||||
retries: 5
|
||||
@@ -64,15 +65,17 @@ services:
|
||||
PAPERLESS_TIKA_ENDPOINT: http://tika:9998
|
||||
|
||||
gotenberg:
|
||||
image: thecodingmachine/gotenberg
|
||||
image: docker.io/gotenberg/gotenberg:7.4
|
||||
restart: unless-stopped
|
||||
environment:
|
||||
DISABLE_GOOGLE_CHROME: 1
|
||||
command:
|
||||
- "gotenberg"
|
||||
- "--chromium-disable-routes=true"
|
||||
|
||||
tika:
|
||||
image: apache/tika
|
||||
image: ghcr.io/paperless-ngx/tika:latest
|
||||
restart: unless-stopped
|
||||
|
||||
volumes:
|
||||
data:
|
||||
media:
|
||||
redisdata:
|
||||
|
@@ -26,18 +26,20 @@
|
||||
version: "3.4"
|
||||
services:
|
||||
broker:
|
||||
image: redis:6.0
|
||||
image: docker.io/library/redis:7
|
||||
restart: unless-stopped
|
||||
volumes:
|
||||
- redisdata:/data
|
||||
|
||||
webserver:
|
||||
image: jonaswinkler/paperless-ng:latest
|
||||
image: ghcr.io/paperless-ngx/paperless-ngx:latest
|
||||
restart: unless-stopped
|
||||
depends_on:
|
||||
- broker
|
||||
ports:
|
||||
- 8000:8000
|
||||
healthcheck:
|
||||
test: ["CMD", "curl", "-f", "http://localhost:8000"]
|
||||
test: ["CMD", "curl", "-fs", "-S", "--max-time", "2", "http://localhost:8000"]
|
||||
interval: 30s
|
||||
timeout: 10s
|
||||
retries: 5
|
||||
@@ -54,3 +56,4 @@ services:
|
||||
volumes:
|
||||
data:
|
||||
media:
|
||||
redisdata:
|
||||
|
@@ -1,7 +1,38 @@
|
||||
#!/bin/bash
|
||||
#!/usr/bin/env bash
|
||||
|
||||
set -e
|
||||
|
||||
# Adapted from:
|
||||
# https://github.com/docker-library/postgres/blob/master/docker-entrypoint.sh
|
||||
# usage: file_env VAR
|
||||
# ie: file_env 'XYZ_DB_PASSWORD' will allow for "$XYZ_DB_PASSWORD_FILE" to
|
||||
# fill in the value of "$XYZ_DB_PASSWORD" from a file, especially for Docker's
|
||||
# secrets feature
|
||||
file_env() {
|
||||
local var="$1"
|
||||
local fileVar="${var}_FILE"
|
||||
|
||||
# Basic validation
|
||||
if [ "${!var:-}" ] && [ "${!fileVar:-}" ]; then
|
||||
echo >&2 "error: both $var and $fileVar are set (but are exclusive)"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Only export var if the _FILE exists
|
||||
if [ "${!fileVar:-}" ]; then
|
||||
# And the file exists
|
||||
if [[ -f ${!fileVar} ]]; then
|
||||
echo "Setting ${var} from file"
|
||||
val="$(< "${!fileVar}")"
|
||||
export "$var"="$val"
|
||||
else
|
||||
echo "File ${!fileVar} doesn't exist"
|
||||
exit 1
|
||||
fi
|
||||
fi
|
||||
|
||||
}
|
||||
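As a quick illustration of how the helper behaves (the path and variable values below are made up): with only the _FILE variant set, the plain variable is populated from the file's contents, while setting both aborts the container start.

# Hypothetical usage, e.g. with a Docker secret mounted at /run/secrets/db_pass
export PAPERLESS_DBPASS_FILE=/run/secrets/db_pass
file_env PAPERLESS_DBPASS      # exports PAPERLESS_DBPASS with the file's contents
unset PAPERLESS_DBPASS_FILE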
|
||||
# Source: https://github.com/sameersbn/docker-gitlab/
|
||||
map_uidgid() {
|
||||
USERMAP_ORIG_UID=$(id -u paperless)
|
||||
@@ -10,31 +41,71 @@ map_uidgid() {
|
||||
USERMAP_NEW_GID=${USERMAP_GID:-${USERMAP_ORIG_GID:-$USERMAP_NEW_UID}}
|
||||
if [[ ${USERMAP_NEW_UID} != "${USERMAP_ORIG_UID}" || ${USERMAP_NEW_GID} != "${USERMAP_ORIG_GID}" ]]; then
|
||||
echo "Mapping UID and GID for paperless:paperless to $USERMAP_NEW_UID:$USERMAP_NEW_GID"
|
||||
usermod -u "${USERMAP_NEW_UID}" paperless
|
||||
usermod -o -u "${USERMAP_NEW_UID}" paperless
|
||||
groupmod -o -g "${USERMAP_NEW_GID}" paperless
|
||||
fi
|
||||
}
|
||||
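In practice the remapping is driven by two environment variables passed to the container; a sketch of how they might be supplied so files created by paperless match the invoking host user (image tag taken from the compose files above, values are examples):

docker run --rm \
    -e USERMAP_UID="$(id -u)" \
    -e USERMAP_GID="$(id -g)" \
    ghcr.io/paperless-ngx/paperless-ngx:latest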
|
||||
map_folders() {
|
||||
# Export these so they can be used in docker-prepare.sh
|
||||
export DATA_DIR="${PAPERLESS_DATA_DIR:-/usr/src/paperless/data}"
|
||||
export MEDIA_ROOT_DIR="${PAPERLESS_MEDIA_ROOT:-/usr/src/paperless/media}"
|
||||
export CONSUME_DIR="${PAPERLESS_CONSUMPTION_DIR:-/usr/src/paperless/consume}"
|
||||
}
|
||||
|
||||
initialize() {
|
||||
|
||||
# Setup environment from secrets before anything else
|
||||
for env_var in \
|
||||
PAPERLESS_DBUSER \
|
||||
PAPERLESS_DBPASS \
|
||||
PAPERLESS_SECRET_KEY \
|
||||
PAPERLESS_AUTO_LOGIN_USERNAME \
|
||||
PAPERLESS_ADMIN_USER \
|
||||
PAPERLESS_ADMIN_MAIL \
|
||||
PAPERLESS_ADMIN_PASSWORD \
|
||||
PAPERLESS_REDIS; do
|
||||
# Check for a version of this var with _FILE appended
|
||||
# and convert the contents to the env var value
|
||||
file_env ${env_var}
|
||||
done
|
||||
|
||||
# Change the user and group IDs if needed
|
||||
map_uidgid
|
||||
|
||||
for dir in export data data/index media media/documents media/documents/originals media/documents/thumbnails; do
|
||||
if [[ ! -d "../$dir" ]]; then
|
||||
echo "Creating directory ../$dir"
|
||||
mkdir ../$dir
|
||||
# Check for overrides of certain folders
|
||||
map_folders
|
||||
|
||||
local export_dir="/usr/src/paperless/export"
|
||||
|
||||
for dir in \
|
||||
"${export_dir}" \
|
||||
"${DATA_DIR}" "${DATA_DIR}/index" \
|
||||
"${MEDIA_ROOT_DIR}" "${MEDIA_ROOT_DIR}/documents" "${MEDIA_ROOT_DIR}/documents/originals" "${MEDIA_ROOT_DIR}/documents/thumbnails" \
|
||||
"${CONSUME_DIR}"; do
|
||||
if [[ ! -d "${dir}" ]]; then
|
||||
echo "Creating directory ${dir}"
|
||||
mkdir "${dir}"
|
||||
fi
|
||||
done
|
||||
|
||||
echo "Creating directory /tmp/paperless"
|
||||
mkdir -p /tmp/paperless
|
||||
local tmp_dir="/tmp/paperless"
|
||||
echo "Creating directory ${tmp_dir}"
|
||||
mkdir -p "${tmp_dir}"
|
||||
|
||||
set +e
|
||||
echo "Adjusting permissions of paperless files. This may take a while."
|
||||
chown -R paperless:paperless /tmp/paperless
|
||||
find .. -not \( -user paperless -and -group paperless \) -exec chown paperless:paperless {} +
|
||||
chown -R paperless:paperless ${tmp_dir}
|
||||
for dir in \
|
||||
"${export_dir}" \
|
||||
"${DATA_DIR}" \
|
||||
"${MEDIA_ROOT_DIR}" \
|
||||
"${CONSUME_DIR}"; do
|
||||
find "${dir}" -not \( -user paperless -and -group paperless \) -exec chown paperless:paperless {} +
|
||||
done
|
||||
set -e
|
||||
|
||||
gosu paperless /sbin/docker-prepare.sh
|
||||
"${gosu_cmd[@]}" /sbin/docker-prepare.sh
|
||||
}
|
||||
|
||||
install_languages() {
|
||||
@@ -56,12 +127,12 @@ install_languages() {
|
||||
# continue
|
||||
#fi
|
||||
|
||||
if dpkg -s $pkg &>/dev/null; then
|
||||
if dpkg -s "$pkg" &>/dev/null; then
|
||||
echo "Package $pkg already installed!"
|
||||
continue
|
||||
fi
|
||||
|
||||
if ! apt-cache show $pkg &>/dev/null; then
|
||||
if ! apt-cache show "$pkg" &>/dev/null; then
|
||||
echo "Package $pkg not found! :("
|
||||
continue
|
||||
fi
|
||||
@@ -74,10 +145,15 @@ install_languages() {
|
||||
done
|
||||
}
|
||||
|
||||
echo "Paperless-ng docker container starting..."
|
||||
echo "Paperless-ngx docker container starting..."
|
||||
|
||||
gosu_cmd=(gosu paperless)
|
||||
if [ "$(id -u)" == "$(id -u paperless)" ]; then
|
||||
gosu_cmd=()
|
||||
fi
|
||||
|
||||
# Install additional languages if specified
|
||||
if [[ ! -z "$PAPERLESS_OCR_LANGUAGES" ]]; then
|
||||
if [[ -n "$PAPERLESS_OCR_LANGUAGES" ]]; then
|
||||
install_languages "$PAPERLESS_OCR_LANGUAGES"
|
||||
fi
|
||||
|
||||
@@ -85,7 +161,7 @@ initialize
|
||||
|
||||
if [[ "$1" != "/"* ]]; then
|
||||
echo Executing management command "$@"
|
||||
exec gosu paperless python3 manage.py "$@"
|
||||
exec "${gosu_cmd[@]}" python3 manage.py "$@"
|
||||
else
|
||||
echo Executing "$@"
|
||||
exec "$@"
|
||||
|
@@ -1,19 +1,19 @@
|
||||
#!/usr/bin/env bash
|
||||
|
||||
set -e
|
||||
|
||||
wait_for_postgres() {
|
||||
attempt_num=1
|
||||
max_attempts=5
|
||||
local attempt_num=1
|
||||
local max_attempts=5
|
||||
|
||||
echo "Waiting for PostgreSQL to start..."
|
||||
|
||||
host="${PAPERLESS_DBHOST}"
|
||||
port="${PAPERLESS_DBPORT}"
|
||||
local host="${PAPERLESS_DBHOST:-localhost}"
|
||||
local port="${PAPERLESS_DBPORT:-5432}"
|
||||
|
||||
if [[ -z $port ]]; then
|
||||
port="5432"
|
||||
fi
|
||||
|
||||
while ! </dev/tcp/$host/$port; do
|
||||
# Disable warning, host and port can't have spaces
|
||||
# shellcheck disable=SC2086
|
||||
while [ ! "$(pg_isready -h ${host} -p ${port})" ]; do
|
||||
|
||||
if [ $attempt_num -eq $max_attempts ]; then
|
||||
echo "Unable to connect to database."
|
||||
@@ -23,11 +23,43 @@ wait_for_postgres() {
|
||||
|
||||
fi
|
||||
|
||||
attempt_num=$(expr "$attempt_num" + 1)
|
||||
attempt_num=$(("$attempt_num" + 1))
|
||||
sleep 5
|
||||
done
|
||||
}
|
||||
|
||||
wait_for_mariadb() {
|
||||
echo "Waiting for MariaDB to start..."
|
||||
|
||||
host="${PAPERLESS_DBHOST:=localhost}"
|
||||
port="${PAPERLESS_DBPORT:=3306}"
|
||||
|
||||
attempt_num=1
|
||||
max_attempts=5
|
||||
|
||||
while ! true > /dev/tcp/$host/$port; do
|
||||
|
||||
if [ $attempt_num -eq $max_attempts ]; then
|
||||
echo "Unable to connect to database."
|
||||
exit 1
|
||||
else
|
||||
echo "Attempt $attempt_num failed! Trying again in 5 seconds..."
|
||||
|
||||
fi
|
||||
|
||||
attempt_num=$(("$attempt_num" + 1))
|
||||
sleep 5
|
||||
done
|
||||
}
|
||||
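The MariaDB wait loop relies on bash's built-in /dev/tcp pseudo-device instead of installing a database client; the same probe can be tried by hand (host and port below are placeholders):

# Succeeds only if something is listening on the given host:port
if (true > /dev/tcp/db/3306) 2>/dev/null; then
    echo "db:3306 is reachable"
else
    echo "db:3306 is not reachable"
fi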
|
||||
wait_for_redis() {
|
||||
# We use a Python script to send the Redis ping
|
||||
# instead of installing redis-tools just for 1 thing
|
||||
if ! python3 /sbin/wait-for-redis.py; then
|
||||
exit 1
|
||||
fi
|
||||
}
|
||||
|
||||
migrations() {
|
||||
(
|
||||
# flock is in place to prevent multiple containers from doing migrations
|
||||
@@ -36,17 +68,18 @@ migrations() {
|
||||
flock 200
|
||||
echo "Apply database migrations..."
|
||||
python3 manage.py migrate
|
||||
) 200>/usr/src/paperless/data/migration_lock
|
||||
) 200>"${DATA_DIR}/migration_lock"
|
||||
}
|
||||
|
||||
search_index() {
|
||||
index_version=1
|
||||
index_version_file=/usr/src/paperless/data/.index_version
|
||||
|
||||
if [[ (! -f "$index_version_file") || $(<$index_version_file) != "$index_version" ]]; then
|
||||
local index_version=1
|
||||
local index_version_file=${DATA_DIR}/.index_version
|
||||
|
||||
if [[ (! -f "${index_version_file}") || $(<"${index_version_file}") != "$index_version" ]]; then
|
||||
echo "Search index out of date. Updating..."
|
||||
python3 manage.py document_index reindex
|
||||
echo $index_version | tee $index_version_file >/dev/null
|
||||
python3 manage.py document_index reindex --no-progress-bar
|
||||
echo ${index_version} | tee "${index_version_file}" >/dev/null
|
||||
fi
|
||||
}
|
||||
|
||||
@@ -57,10 +90,14 @@ superuser() {
|
||||
}
|
||||
|
||||
do_work() {
|
||||
if [[ -n "${PAPERLESS_DBHOST}" ]]; then
|
||||
if [[ "${PAPERLESS_DBENGINE}" == "mariadb" ]]; then
|
||||
wait_for_mariadb
|
||||
elif [[ -n "${PAPERLESS_DBHOST}" ]]; then
|
||||
wait_for_postgres
|
||||
fi
|
||||
|
||||
wait_for_redis
|
||||
|
||||
migrations
|
||||
|
||||
search_index
|
||||
|
@@ -1,4 +1,19 @@
|
||||
for command in document_archiver document_exporter document_importer mail_fetcher document_create_classifier document_index document_renamer document_retagger document_thumbnails document_sanity_checker manage_superuser;
|
||||
#!/usr/bin/env bash
|
||||
|
||||
set -eu
|
||||
|
||||
for command in decrypt_documents \
|
||||
document_archiver \
|
||||
document_exporter \
|
||||
document_importer \
|
||||
mail_fetcher \
|
||||
document_create_classifier \
|
||||
document_index \
|
||||
document_renamer \
|
||||
document_retagger \
|
||||
document_thumbnails \
|
||||
document_sanity_checker \
|
||||
manage_superuser;
|
||||
do
|
||||
echo "installing $command..."
|
||||
sed "s/management_command/$command/g" management_script.sh > /usr/local/bin/$command
|
||||
|
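Each generated wrapper is just management_script.sh with the placeholder substituted, so inside the container the management commands can be invoked by name; illustrative calls (the arguments shown are assumptions, check each command's --help):

# Equivalent to: python3 manage.py document_exporter ../export (run as the paperless user)
document_exporter ../export
document_retagger --tags --inbox-only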
@@ -1,4 +1,4 @@
|
||||
#!/bin/bash
|
||||
#!/usr/bin/env bash
|
||||
|
||||
set -e
|
||||
|
||||
@@ -6,10 +6,10 @@ cd /usr/src/paperless/src/
|
||||
|
||||
if [[ $(id -u) == 0 ]] ;
|
||||
then
|
||||
gosu paperless python3 manage.py management_command "$@"
|
||||
gosu paperless python3 manage.py management_command "$@"
|
||||
elif [[ $(id -un) == "paperless" ]] ;
|
||||
then
|
||||
python3 manage.py management_command "$@"
|
||||
python3 manage.py management_command "$@"
|
||||
else
|
||||
echo "Unknown user."
|
||||
echo "Unknown user."
|
||||
fi
|
||||
|
15
docker/paperless_cmd.sh
Executable file
@@ -0,0 +1,15 @@
|
||||
#!/usr/bin/env bash
|
||||
|
||||
rootless_args=()
|
||||
if [ "$(id -u)" == "$(id -u paperless)" ]; then
|
||||
rootless_args=(
|
||||
--user
|
||||
paperless
|
||||
--logfile
|
||||
supervisord.log
|
||||
--pidfile
|
||||
supervisord.pid
|
||||
)
|
||||
fi
|
||||
|
||||
exec /usr/local/bin/supervisord -c /etc/supervisord.conf "${rootless_args[@]}"
|
@@ -19,6 +19,7 @@ stderr_logfile_maxbytes=0
|
||||
[program:consumer]
|
||||
command=python3 manage.py document_consumer
|
||||
user=paperless
|
||||
stopsignal=INT
|
||||
|
||||
stdout_logfile=/dev/stdout
|
||||
stdout_logfile_maxbytes=0
|
||||
@@ -28,6 +29,7 @@ stderr_logfile_maxbytes=0
|
||||
[program:scheduler]
|
||||
command=python3 manage.py qcluster
|
||||
user=paperless
|
||||
stopasgroup = true
|
||||
|
||||
stdout_logfile=/dev/stdout
|
||||
stdout_logfile_maxbytes=0
|
||||
|
44
docker/wait-for-redis.py
Executable file
@@ -0,0 +1,44 @@
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
Simple script which attempts to ping the Redis broker as set in the environment for
|
||||
a certain number of times, waiting a little bit in between
|
||||
|
||||
"""
|
||||
import os
|
||||
import sys
|
||||
import time
|
||||
from typing import Final
|
||||
|
||||
from redis import Redis
|
||||
|
||||
if __name__ == "__main__":
|
||||
|
||||
MAX_RETRY_COUNT: Final[int] = 5
|
||||
RETRY_SLEEP_SECONDS: Final[int] = 5
|
||||
|
||||
REDIS_URL: Final[str] = os.getenv("PAPERLESS_REDIS", "redis://localhost:6379")
|
||||
|
||||
print(f"Waiting for Redis...", flush=True)
|
||||
|
||||
attempt = 0
|
||||
with Redis.from_url(url=REDIS_URL) as client:
|
||||
while attempt < MAX_RETRY_COUNT:
|
||||
try:
|
||||
client.ping()
|
||||
break
|
||||
except Exception as e:
|
||||
print(
|
||||
f"Redis ping #{attempt} failed.\n"
|
||||
f"Error: {str(e)}.\n"
|
||||
f"Waiting {RETRY_SLEEP_SECONDS}s",
|
||||
flush=True,
|
||||
)
|
||||
time.sleep(RETRY_SLEEP_SECONDS)
|
||||
attempt += 1
|
||||
|
||||
if attempt >= MAX_RETRY_COUNT:
|
||||
print(f"Failed to connect to redis using environment variable PAPERLESS_REDIS.")
|
||||
sys.exit(os.EX_UNAVAILABLE)
|
||||
else:
|
||||
print(f"Connected to Redis broker.")
|
||||
sys.exit(os.EX_OK)
|
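The script only needs PAPERLESS_REDIS (or its localhost default) to be set, so it can also be exercised outside the entrypoint; for instance (the URL is an example matching the compose files above):

PAPERLESS_REDIS=redis://broker:6379 python3 /sbin/wait-for-redis.py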
@@ -1,18 +0,0 @@
|
||||
FROM python:3.5.1
|
||||
MAINTAINER Pit Kleyersburg <pitkley@googlemail.com>
|
||||
|
||||
# Install Sphinx and Pygments
|
||||
RUN pip install Sphinx Pygments
|
||||
|
||||
# Setup directories, copy data
|
||||
RUN mkdir /build
|
||||
COPY . /build
|
||||
WORKDIR /build/docs
|
||||
|
||||
# Build documentation
|
||||
RUN make html
|
||||
|
||||
# Start webserver
|
||||
WORKDIR /build/docs/_build/html
|
||||
EXPOSE 8000/tcp
|
||||
CMD ["python3", "-m", "http.server"]
|
@@ -24,6 +24,7 @@ I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
|
||||
help:
|
||||
@echo "Please use \`make <target>' where <target> is one of"
|
||||
@echo " html to make standalone HTML files"
|
||||
@echo " livehtml to preview changes with live reload in your browser"
|
||||
@echo " dirhtml to make HTML files named index.html in directories"
|
||||
@echo " singlehtml to make a single large HTML file"
|
||||
@echo " pickle to make pickle files"
|
||||
@@ -54,6 +55,9 @@ html:
|
||||
@echo
|
||||
@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."
|
||||
|
||||
livehtml:
|
||||
sphinx-autobuild "./" "$(BUILDDIR)" $(O)
|
||||
|
||||
dirhtml:
|
||||
$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
|
||||
@echo
|
||||
|
597
docs/_static/css/custom.css
vendored
Normal file
@@ -0,0 +1,597 @@
|
||||
/* Variables */
|
||||
:root {
|
||||
--color-text-body: #5c5962;
|
||||
--color-text-body-light: #fcfcfc;
|
||||
--color-text-anchor: #7253ed;
|
||||
--color-text-alt: rgba(0, 0, 0, 0.3);
|
||||
--color-text-title: #27262b;
|
||||
--color-text-code-inline: #e74c3c;
|
||||
--color-text-code-nt: #062873;
|
||||
--color-text-selection: #b19eff;
|
||||
--color-bg-body: #fcfcfc;
|
||||
--color-bg-body-alt: #f3f6f6;
|
||||
--color-bg-side-nav: #f5f6fa;
|
||||
--color-bg-side-nav-hover: #ebedf5;
|
||||
--color-bg-code-block: var(--color-bg-side-nav);
|
||||
--color-border: #eeebee;
|
||||
--color-btn-neutral-bg: #f3f6f6;
|
||||
--color-btn-neutral-bg-hover: #e5ebeb;
|
||||
--color-success-title: #1abc9c;
|
||||
--color-success-body: #dbfaf4;
|
||||
--color-warning-title: #f0b37e;
|
||||
--color-warning-body: #ffedcc;
|
||||
--color-danger-title: #f29f97;
|
||||
--color-danger-body: #fdf3f2;
|
||||
--color-info-title: #6ab0de;
|
||||
--color-info-body: #e7f2fa;
|
||||
}
|
||||
|
||||
.dark-mode {
|
||||
--color-text-body: #abb2bf;
|
||||
--color-text-body-light: #9499a2;
|
||||
--color-text-alt: rgba(255, 255, 255, 0.5);
|
||||
--color-text-title: var(--color-text-anchor);
|
||||
--color-text-code-inline: #abb2bf;
|
||||
--color-text-code-nt: #2063f3;
|
||||
--color-text-selection: #030303;
|
||||
--color-bg-body: #1d1d20 !important;
|
||||
--color-bg-body-alt: #131315;
|
||||
--color-bg-side-nav: #18181a;
|
||||
--color-bg-side-nav-hover: #101216;
|
||||
--color-bg-code-block: #101216;
|
||||
--color-border: #47494f;
|
||||
--color-btn-neutral-bg: #242529;
|
||||
--color-btn-neutral-bg-hover: #101216;
|
||||
--color-success-title: #02120f;
|
||||
--color-success-body: #041b17;
|
||||
--color-warning-title: #1b0e03;
|
||||
--color-warning-body: #371d06;
|
||||
--color-danger-title: #120902;
|
||||
--color-danger-body: #1b0503;
|
||||
--color-info-title: #020608;
|
||||
--color-info-body: #06141e;
|
||||
}
|
||||
|
||||
* {
|
||||
transition: background-color 0.3s ease, border-color 0.3s ease;
|
||||
}
|
||||
|
||||
/* Typography */
|
||||
body {
|
||||
font-family: system-ui,-apple-system,BlinkMacSystemFont,"Segoe UI",Roboto,"Helvetica Neue",Arial,sans-serif;
|
||||
font-size: inherit;
|
||||
line-height: 1.4;
|
||||
color: var(--color-text-body);
|
||||
}
|
||||
|
||||
.rst-content p {
|
||||
word-break: break-word;
|
||||
}
|
||||
|
||||
h1, h2, h3, h4, h5, h6 {
|
||||
font-family: inherit;
|
||||
}
|
||||
|
||||
.rst-content .toctree-wrapper>p.caption, .rst-content h1, .rst-content h2, .rst-content h3, .rst-content h4, .rst-content h5, .rst-content h6 {
|
||||
padding-top: .5em;
|
||||
}
|
||||
|
||||
p, .main-content-wrap, .rst-content .section ul, .rst-content .toctree-wrapper ul, .rst-content section ul, .wy-plain-list-disc, article ul {
|
||||
line-height: 1.6;
|
||||
}
|
||||
|
||||
pre, .code, .rst-content .linenodiv pre, .rst-content div[class^=highlight] pre, .rst-content pre.literal-block {
|
||||
font-family: "SFMono-Regular", Menlo,Consolas, Monospace;
|
||||
font-size: 0.75em;
|
||||
line-height: 1.8;
|
||||
}
|
||||
|
||||
.wy-menu-vertical li.toctree-l3,.wy-menu-vertical li.toctree-l4 {
|
||||
font-size: 1rem
|
||||
}
|
||||
|
||||
.rst-versions {
|
||||
font-family: inherit;
|
||||
line-height: 1;
|
||||
}
|
||||
|
||||
footer, footer p {
|
||||
font-size: .8rem;
|
||||
}
|
||||
|
||||
footer .rst-footer-buttons {
|
||||
font-size: 1rem;
|
||||
}
|
||||
|
||||
@media (max-width: 400px) {
|
||||
/* break code lines on mobile */
|
||||
pre, code {
|
||||
word-break: break-word;
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/* Layout */
|
||||
.wy-side-nav-search, .wy-menu-vertical {
|
||||
width: auto;
|
||||
}
|
||||
|
||||
.wy-nav-side {
|
||||
z-index: 0;
|
||||
display: flex;
|
||||
flex-wrap: wrap;
|
||||
background-color: var(--color-bg-side-nav);
|
||||
}
|
||||
|
||||
.wy-side-scroll {
|
||||
width: 100%;
|
||||
overflow-y: auto;
|
||||
}
|
||||
|
||||
@media (min-width: 66.5rem) {
|
||||
.wy-side-scroll {
|
||||
width:264px
|
||||
}
|
||||
}
|
||||
|
||||
@media (min-width: 50rem) {
|
||||
.wy-nav-side {
|
||||
flex-wrap: nowrap;
|
||||
position: fixed;
|
||||
width: 248px;
|
||||
height: 100%;
|
||||
flex-direction: column;
|
||||
border-right: 1px solid var(--color-border);
|
||||
align-items:flex-end
|
||||
}
|
||||
}
|
||||
|
||||
@media (min-width: 66.5rem) {
|
||||
.wy-nav-side {
|
||||
width: calc((100% - 1064px) / 2 + 264px);
|
||||
min-width:264px
|
||||
}
|
||||
}
|
||||
|
||||
@media (min-width: 50rem) {
|
||||
.wy-nav-content-wrap {
|
||||
position: relative;
|
||||
max-width: 800px;
|
||||
margin-left:248px
|
||||
}
|
||||
}
|
||||
|
||||
@media (min-width: 66.5rem) {
|
||||
.wy-nav-content-wrap {
|
||||
margin-left:calc((100% - 1064px) / 2 + 264px)
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/* Colors */
|
||||
body.wy-body-for-nav,
|
||||
.wy-nav-content {
|
||||
background: var(--color-bg-body);
|
||||
}
|
||||
|
||||
.wy-nav-side {
|
||||
border-right: 1px solid var(--color-border);
|
||||
}
|
||||
|
||||
.wy-side-nav-search, .wy-nav-top {
|
||||
background: var(--color-bg-side-nav);
|
||||
border-bottom: 1px solid var(--color-border);
|
||||
}
|
||||
|
||||
.wy-nav-content-wrap {
|
||||
background: inherit;
|
||||
}
|
||||
|
||||
.wy-side-nav-search > a, .wy-nav-top a, .wy-nav-top i {
|
||||
color: var(--color-text-title);
|
||||
}
|
||||
|
||||
.wy-side-nav-search > a:hover, .wy-nav-top a:hover {
|
||||
background: transparent;
|
||||
}
|
||||
|
||||
.wy-side-nav-search > div.version {
|
||||
color: var(--color-text-alt)
|
||||
}
|
||||
|
||||
.wy-side-nav-search > div[role="search"] {
|
||||
border-top: 1px solid var(--color-border);
|
||||
}
|
||||
|
||||
.wy-menu-vertical li.toctree-l2.current>a, .wy-menu-vertical li.toctree-l2.current li.toctree-l3>a,
|
||||
.wy-menu-vertical li.toctree-l3.current>a, .wy-menu-vertical li.toctree-l3.current li.toctree-l4>a {
|
||||
background: var(--color-bg-side-nav);
|
||||
}
|
||||
|
||||
.rst-content .highlighted {
|
||||
background: #eedd85;
|
||||
box-shadow: 0 0 0 2px #eedd85;
|
||||
font-weight: 600;
|
||||
}
|
||||
|
||||
.wy-side-nav-search input[type=text],
|
||||
html.writer-html5 .rst-content table.docutils th {
|
||||
color: var(--color-text-body);
|
||||
}
|
||||
|
||||
.rst-content table.docutils:not(.field-list) tr:nth-child(2n-1) td,
|
||||
.wy-table-backed,
|
||||
.wy-table-odd td,
|
||||
.wy-table-striped tr:nth-child(2n-1) td {
|
||||
background-color: var(--color-bg-body-alt);
|
||||
}
|
||||
|
||||
.rst-content table.docutils,
|
||||
.wy-table-bordered-all,
|
||||
html.writer-html5 .rst-content table.docutils th,
|
||||
.rst-content table.docutils td,
|
||||
.wy-table-bordered-all td,
|
||||
hr {
|
||||
border-color: var(--color-border) !important;
|
||||
}
|
||||
|
||||
::selection {
|
||||
background: var(--color-text-selection);
|
||||
}
|
||||
|
||||
/* Ridiculous rules are taken from sphinx_rtd */
|
||||
.rst-content .admonition-title,
|
||||
.wy-alert-title {
|
||||
color: var(--color-text-body-light);
|
||||
}
|
||||
|
||||
.rst-content .hint,
|
||||
.rst-content .important,
|
||||
.rst-content .tip,
|
||||
.rst-content .wy-alert-success,
|
||||
.wy-alert.wy-alert-success {
|
||||
background: var(--color-success-body);
|
||||
}
|
||||
|
||||
.rst-content .hint .admonition-title,
|
||||
.rst-content .hint .wy-alert-title,
|
||||
.rst-content .important .admonition-title,
|
||||
.rst-content .important .wy-alert-title,
|
||||
.rst-content .tip .admonition-title,
|
||||
.rst-content .tip .wy-alert-title,
|
||||
.rst-content .wy-alert-success .admonition-title,
|
||||
.rst-content .wy-alert-success .wy-alert-title,
|
||||
.wy-alert.wy-alert-success .rst-content .admonition-title,
|
||||
.wy-alert.wy-alert-success .wy-alert-title {
|
||||
background-color: var(--color-success-title);
|
||||
}
|
||||
|
||||
.rst-content .admonition-todo,
|
||||
.rst-content .attention,
|
||||
.rst-content .caution,
|
||||
.rst-content .warning,
|
||||
.rst-content .wy-alert-warning,
|
||||
.wy-alert.wy-alert-warning {
|
||||
background: var(--color-warning-body);
|
||||
}
|
||||
|
||||
.rst-content .admonition-todo .admonition-title,
|
||||
.rst-content .admonition-todo .wy-alert-title,
|
||||
.rst-content .attention .admonition-title,
|
||||
.rst-content .attention .wy-alert-title,
|
||||
.rst-content .caution .admonition-title,
|
||||
.rst-content .caution .wy-alert-title,
|
||||
.rst-content .warning .admonition-title,
|
||||
.rst-content .warning .wy-alert-title,
|
||||
.rst-content .wy-alert-warning .admonition-title,
|
||||
.rst-content .wy-alert-warning .wy-alert-title,
|
||||
.rst-content .wy-alert.wy-alert-warning .admonition-title,
|
||||
.wy-alert.wy-alert-warning .rst-content .admonition-title,
|
||||
.wy-alert.wy-alert-warning .wy-alert-title {
|
||||
background: var(--color-warning-title);
|
||||
}
|
||||
|
||||
.rst-content .danger,
|
||||
.rst-content .error,
|
||||
.rst-content .wy-alert-danger,
|
||||
.wy-alert.wy-alert-danger {
|
||||
background: var(--color-danger-body);
|
||||
}
|
||||
|
||||
.rst-content .danger .admonition-title,
|
||||
.rst-content .danger .wy-alert-title,
|
||||
.rst-content .error .admonition-title,
|
||||
.rst-content .error .wy-alert-title,
|
||||
.rst-content .wy-alert-danger .admonition-title,
|
||||
.rst-content .wy-alert-danger .wy-alert-title,
|
||||
.wy-alert.wy-alert-danger .rst-content .admonition-title,
|
||||
.wy-alert.wy-alert-danger .wy-alert-title {
|
||||
background: var(--color-danger-title);
|
||||
}
|
||||
|
||||
.rst-content .note,
|
||||
.rst-content .seealso,
|
||||
.rst-content .wy-alert-info,
|
||||
.wy-alert.wy-alert-info {
|
||||
background: var(--color-info-body);
|
||||
}
|
||||
|
||||
.rst-content .note .admonition-title,
|
||||
.rst-content .note .wy-alert-title,
|
||||
.rst-content .seealso .admonition-title,
|
||||
.rst-content .seealso .wy-alert-title,
|
||||
.rst-content .wy-alert-info .admonition-title,
|
||||
.rst-content .wy-alert-info .wy-alert-title,
|
||||
.wy-alert.wy-alert-info .rst-content .admonition-title,
|
||||
.wy-alert.wy-alert-info .wy-alert-title {
|
||||
background: var(--color-info-title);
|
||||
}
|
||||
|
||||
|
||||
|
||||
/* Links */
|
||||
a, a:visited,
|
||||
.wy-menu-vertical a,
|
||||
a.icon.icon-home,
|
||||
.wy-menu-vertical li.toctree-l1.current > a.current {
|
||||
color: var(--color-text-anchor);
|
||||
text-decoration: none;
|
||||
}
|
||||
|
||||
a:hover, .wy-breadcrumbs-aside a {
|
||||
color: var(--color-text-anchor); /* reset */
|
||||
}
|
||||
|
||||
.rst-versions a, .rst-versions .rst-current-version {
|
||||
color: var(--color-text-anchor);
|
||||
}
|
||||
|
||||
.wy-nav-content a.reference, .wy-nav-content a:not([class]) {
|
||||
background-image: linear-gradient(var(--color-border) 0%, var(--color-border) 100%);
|
||||
background-repeat: repeat-x;
|
||||
background-position: 0 100%;
|
||||
background-size: 1px 1px;
|
||||
}
|
||||
|
||||
.wy-nav-content a.reference:hover, .wy-nav-content a:not([class]):hover {
|
||||
background-image: linear-gradient(rgba(114,83,237,0.45) 0%, rgba(114,83,237,0.45) 100%);
|
||||
background-size: 1px 1px;
|
||||
}
|
||||
|
||||
.wy-menu-vertical a:hover,
|
||||
.wy-menu-vertical li.current a:hover,
|
||||
.wy-menu-vertical a:active {
|
||||
background: var(--color-bg-side-nav-hover) !important;
|
||||
color: var(--color-text-body);
|
||||
}
|
||||
|
||||
.wy-menu-vertical li.toctree-l1.current>a,
|
||||
.wy-menu-vertical li.current>a,
|
||||
.wy-menu-vertical li.on a {
|
||||
background-color: var(--color-bg-side-nav-hover);
|
||||
border: none;
|
||||
font-weight: normal;
|
||||
}
|
||||
|
||||
.wy-menu-vertical li.current {
|
||||
background-color: inherit;
|
||||
}
|
||||
|
||||
.wy-menu-vertical li.current a {
|
||||
border-right: none;
|
||||
}
|
||||
|
||||
.wy-menu-vertical li.toctree-l2 a,
|
||||
.wy-menu-vertical li.toctree-l3 a,
|
||||
.wy-menu-vertical li.toctree-l4 a,
|
||||
.wy-menu-vertical li.toctree-l5 a,
|
||||
.wy-menu-vertical li.toctree-l6 a,
|
||||
.wy-menu-vertical li.toctree-l7 a,
|
||||
.wy-menu-vertical li.toctree-l8 a,
|
||||
.wy-menu-vertical li.toctree-l9 a,
|
||||
.wy-menu-vertical li.toctree-l10 a {
|
||||
color: var(--color-text-body);
|
||||
}
|
||||
|
||||
a.image-reference, a.image-reference:hover {
|
||||
background: none !important;
|
||||
}
|
||||
|
||||
a.image-reference img {
|
||||
cursor: zoom-in;
|
||||
}
|
||||
|
||||
|
||||
/* Code blocks */
|
||||
.rst-content code, .rst-content tt, code {
|
||||
padding: 0.25em;
|
||||
font-weight: 400;
|
||||
background-color: var(--color-bg-code-block);
|
||||
border: 1px solid var(--color-border);
|
||||
border-radius: 4px;
|
||||
}
|
||||
|
||||
.rst-content div[class^=highlight], .rst-content pre.literal-block {
|
||||
padding: 0.7rem;
|
||||
margin-top: 0;
|
||||
margin-bottom: 0.75rem;
|
||||
overflow-x: auto;
|
||||
background-color: var(--color-bg-side-nav);
|
||||
border-color: var(--color-border);
|
||||
border-radius: 4px;
|
||||
box-shadow: none;
|
||||
}
|
||||
|
||||
.rst-content .admonition-title,
|
||||
.rst-content div.admonition,
|
||||
.wy-alert-title {
|
||||
padding: 10px 12px;
|
||||
border-top-left-radius: 4px;
|
||||
border-top-right-radius: 4px;
|
||||
}
|
||||
|
||||
.highlight .go {
|
||||
color: inherit;
|
||||
}
|
||||
|
||||
.highlight .nt {
|
||||
color: var(--color-text-code-nt);
|
||||
}
|
||||
|
||||
.rst-content code.literal,
|
||||
.rst-content tt.literal,
|
||||
html.writer-html5 .rst-content dl.footnote code {
|
||||
border-color: var(--color-border);
|
||||
background-color: var(--color-border);
|
||||
color: var(--color-text-code-inline)
|
||||
}
|
||||
|
||||
|
||||
/* Search */
|
||||
.wy-side-nav-search input[type=text] {
|
||||
border: none;
|
||||
border-radius: 0;
|
||||
background-color: transparent;
|
||||
font-family: inherit;
|
||||
font-size: .85rem;
|
||||
box-shadow: none;
|
||||
padding: .7rem 1rem .7rem 2.8rem;
|
||||
margin: 0;
|
||||
}
|
||||
|
||||
#rtd-search-form {
|
||||
position: relative;
|
||||
}
|
||||
|
||||
#rtd-search-form:before {
|
||||
font: normal normal normal 14px/1 FontAwesome;
|
||||
font-size: inherit;
|
||||
text-rendering: auto;
|
||||
-webkit-font-smoothing: antialiased;
|
||||
-moz-osx-font-smoothing: grayscale;
|
||||
content: "\f002";
|
||||
color: var(--color-text-alt);
|
||||
position: absolute;
|
||||
left: 1.5rem;
|
||||
top: .7rem;
|
||||
}
|
||||
|
||||
/* Side nav */
|
||||
.wy-side-nav-search {
|
||||
padding: 1rem 0 0 0;
|
||||
}
|
||||
|
||||
.wy-menu-vertical li a button.toctree-expand {
|
||||
float: right;
|
||||
margin-right: -1.5em;
|
||||
padding: 0 .5em;
|
||||
}
|
||||
|
||||
.wy-menu-vertical a,
|
||||
.wy-menu-vertical li.current>a,
|
||||
.wy-menu-vertical li.current li>a {
|
||||
padding-right: 1.5em !important;
|
||||
}
|
||||
|
||||
.wy-menu-vertical li.current li>a.current {
|
||||
font-weight: 600;
|
||||
}
|
||||
|
||||
/* Misc spacing */
|
||||
.rst-content .admonition-title, .wy-alert-title {
|
||||
padding: 10px 12px;
|
||||
}
|
||||
|
||||
/* Buttons */
|
||||
.btn {
|
||||
display: inline-block;
|
||||
box-sizing: border-box;
|
||||
padding: 0.3em 1em;
|
||||
margin: 0;
|
||||
font-family: inherit;
|
||||
font-size: inherit;
|
||||
font-weight: 500;
|
||||
line-height: 1.5;
|
||||
color: var(--color-text-anchor);
|
||||
text-decoration: none;
|
||||
vertical-align: baseline;
|
||||
background-color: #f7f7f7;
|
||||
border-width: 0;
|
||||
border-radius: 4px;
|
||||
box-shadow: 0 1px 2px rgba(0,0,0,0.12),0 3px 10px rgba(0,0,0,0.08);
|
||||
appearance: none;
|
||||
}
|
||||
|
||||
.btn:active {
|
||||
padding: 0.3em 1em;
|
||||
}
|
||||
|
||||
.rst-content .btn:focus {
|
||||
outline: 1px solid #ccc;
|
||||
}
|
||||
|
||||
.rst-content .btn-neutral, .rst-content .btn span.fa {
|
||||
color: var(--color-text-body) !important;
|
||||
}
|
||||
|
||||
.btn-neutral {
|
||||
background-color: var(--color-btn-neutral-bg) !important;
|
||||
color: var(--color-btn-neutral-text) !important;
|
||||
border: 1px solid var(--color-btn-neutral-bg);
|
||||
}
|
||||
|
||||
.btn:hover, .btn-neutral:hover {
|
||||
background-color: var(--color-btn-neutral-bg-hover) !important;
|
||||
}
|
||||
|
||||
|
||||
/* Icon overrides */
|
||||
.wy-side-nav-search a.icon-home:before {
|
||||
display: none;
|
||||
}
|
||||
|
||||
.fa-minus-square-o:before,.wy-menu-vertical li.current>a button.toctree-expand:before,.wy-menu-vertical li.on a button.toctree-expand:before {
|
||||
content: "\f106"; /* fa-angle-up */
|
||||
}
|
||||
|
||||
.fa-plus-square-o:before, .wy-menu-vertical li button.toctree-expand:before {
|
||||
content: "\f107"; /* fa-angle-down */
|
||||
}
|
||||
|
||||
|
||||
/* Misc */
|
||||
.wy-nav-top {
|
||||
line-height: 36px;
|
||||
}
|
||||
|
||||
.wy-nav-top > i {
|
||||
font-size: 24px;
|
||||
padding: 8px 0 0 2px;
|
||||
color: var(--color-text-anchor);
|
||||
}
|
||||
|
||||
.rst-content table.docutils td,
|
||||
.rst-content table.docutils th,
|
||||
.rst-content table.field-list td,
|
||||
.rst-content table.field-list th,
|
||||
.wy-table td,
|
||||
.wy-table th {
|
||||
padding: 8px 14px;
|
||||
}
|
||||
|
||||
.dark-mode-toggle {
|
||||
position: absolute;
|
||||
top: 14px;
|
||||
right: 12px;
|
||||
height: 20px;
|
||||
width: 24px;
|
||||
z-index: 10;
|
||||
border: none;
|
||||
background-color: transparent;
|
||||
color: inherit;
|
||||
opacity: 0.7;
|
||||
}
|
||||
|
||||
.wy-nav-content-wrap {
|
||||
z-index: 20;
|
||||
}
|
14
docs/_static/custom.css
vendored
@@ -1,14 +0,0 @@
|
||||
/* override table width restrictions */
|
||||
@media screen and (min-width: 767px) {
|
||||
|
||||
.wy-table-responsive table td {
|
||||
/* !important prevents the common CSS stylesheets from
|
||||
overriding this as on RTD they are loaded after this stylesheet */
|
||||
white-space: normal !important;
|
||||
}
|
||||
|
||||
.wy-table-responsive {
|
||||
overflow: visible !important;
|
||||
}
|
||||
|
||||
}
|
47
docs/_static/js/darkmode.js
vendored
Normal file
@@ -0,0 +1,47 @@
|
||||
let toggleButton
|
||||
let icon
|
||||
|
||||
function load() {
|
||||
'use strict'
|
||||
|
||||
toggleButton = document.createElement('button')
|
||||
toggleButton.setAttribute('title', 'Toggle dark mode')
|
||||
toggleButton.classList.add('dark-mode-toggle')
|
||||
icon = document.createElement('i')
|
||||
icon.classList.add('fa', darkModeState ? 'fa-sun-o' : 'fa-moon-o')
|
||||
toggleButton.appendChild(icon)
|
||||
document.body.prepend(toggleButton)
|
||||
|
||||
// Listen for changes in the OS settings
|
||||
// addListener is used because older versions of Safari don't support addEventListener
|
||||
// prefersDarkQuery set in <head>
|
||||
if (prefersDarkQuery) {
|
||||
prefersDarkQuery.addListener(function (evt) {
|
||||
toggleDarkMode(evt.matches)
|
||||
})
|
||||
}
|
||||
|
||||
// Initial setting depending on the prefers-color-mode or localstorage
|
||||
// darkModeState should be set in the document <head> to prevent flash
|
||||
if (darkModeState == undefined) darkModeState = false
|
||||
toggleDarkMode(darkModeState)
|
||||
|
||||
// Toggles the "dark-mode" class on click and sets localStorage state
|
||||
toggleButton.addEventListener('click', () => {
|
||||
darkModeState = !darkModeState
|
||||
|
||||
toggleDarkMode(darkModeState)
|
||||
localStorage.setItem('dark-mode', darkModeState)
|
||||
})
|
||||
}
|
||||
|
||||
function toggleDarkMode(state) {
|
||||
document.documentElement.classList.toggle('dark-mode', state)
|
||||
document.documentElement.classList.toggle('light-mode', !state)
|
||||
icon.classList.remove('fa-sun-o')
|
||||
icon.classList.remove('fa-moon-o')
|
||||
icon.classList.add(state ? 'fa-sun-o' : 'fa-moon-o')
|
||||
darkModeState = state
|
||||
}
|
||||
|
||||
document.addEventListener('DOMContentLoaded', load)
|
BIN
docs/_static/screenshot.png
vendored
Before Width: | Height: | Size: 445 KiB |
BIN
docs/_static/screenshots/bulk-edit.png
vendored
Normal file
After Width: | Height: | Size: 661 KiB |
BIN
docs/_static/screenshots/correspondents.png
vendored
Before Width: | Height: | Size: 106 KiB After Width: | Height: | Size: 457 KiB |
BIN
docs/_static/screenshots/dashboard.png
vendored
Before Width: | Height: | Size: 167 KiB After Width: | Height: | Size: 436 KiB |
BIN
docs/_static/screenshots/documents-filter.png
vendored
Before Width: | Height: | Size: 28 KiB After Width: | Height: | Size: 462 KiB |
BIN
docs/_static/screenshots/documents-largecards.png
vendored
Before Width: | Height: | Size: 306 KiB After Width: | Height: | Size: 608 KiB |
BIN
docs/_static/screenshots/documents-smallcards-dark.png
vendored
Normal file
After Width: | Height: | Size: 698 KiB |
BIN
docs/_static/screenshots/documents-smallcards.png
vendored
Before Width: | Height: | Size: 410 KiB After Width: | Height: | Size: 706 KiB |
BIN
docs/_static/screenshots/documents-table.png
vendored
Before Width: | Height: | Size: 137 KiB After Width: | Height: | Size: 480 KiB |
BIN
docs/_static/screenshots/documents-wchrome-dark.png
vendored
Normal file
After Width: | Height: | Size: 680 KiB |
BIN
docs/_static/screenshots/documents-wchrome.png
vendored
Normal file
After Width: | Height: | Size: 686 KiB |
BIN
docs/_static/screenshots/editing.png
vendored
Before Width: | Height: | Size: 293 KiB After Width: | Height: | Size: 848 KiB |
BIN
docs/_static/screenshots/logs.png
vendored
Before Width: | Height: | Size: 260 KiB After Width: | Height: | Size: 703 KiB |
BIN
docs/_static/screenshots/mobile.png
vendored
Before Width: | Height: | Size: 158 KiB After Width: | Height: | Size: 388 KiB |
BIN
docs/_static/screenshots/new-tag.png
vendored
Before Width: | Height: | Size: 32 KiB After Width: | Height: | Size: 26 KiB |
BIN
docs/_static/screenshots/search-preview.png
vendored
Before Width: | Height: | Size: 61 KiB After Width: | Height: | Size: 54 KiB |
BIN
docs/_static/screenshots/search-results.png
vendored
Before Width: | Height: | Size: 261 KiB After Width: | Height: | Size: 517 KiB |
13
docs/_templates/layout.html
vendored
Normal file
@@ -0,0 +1,13 @@
|
||||
{% extends "!layout.html" %}
|
||||
{% block extrahead %}
|
||||
<script>
|
||||
// MediaQueryList object
|
||||
const prefersDarkQuery = window.matchMedia("(prefers-color-scheme: dark)");
|
||||
const lsDark = localStorage.getItem("dark-mode");
|
||||
let darkModeState = lsDark !== null ? lsDark == "true" : prefersDarkQuery.matches;
|
||||
|
||||
document.documentElement.classList.toggle("dark-mode", darkModeState);
|
||||
document.documentElement.classList.toggle("light-mode", !darkModeState);
|
||||
</script>
|
||||
{{ super() }}
|
||||
{% endblock %}
|
@@ -35,13 +35,15 @@ Options available to docker installations:
|
||||
``/var/lib/docker/volumes`` on the host and you need to be root in order
|
||||
to access them.
|
||||
|
||||
Paperless uses 3 volumes:
|
||||
Paperless uses 4 volumes:
|
||||
|
||||
* ``paperless_media``: This is where your documents are stored.
|
||||
* ``paperless_data``: This is where auxiliary data is stored. This
|
||||
folder also contains the SQLite database, if you use it.
|
||||
* ``paperless_pgdata``: Exists only if you use PostgreSQL and contains
|
||||
the database.
|
||||
* ``paperless_dbdata``: Exists only if you use MariaDB and contains
|
||||
the database.
|
||||
|
||||
Options available to bare-metal and non-docker installations:
|
||||
|
||||
@@ -49,7 +51,7 @@ Options available to bare-metal and non-docker installations:
|
||||
crashes at some point or your disk fails, you can simply copy the folder back
|
||||
into place and it works.
|
||||
|
||||
When using PostgreSQL, you'll also have to backup the database.
|
||||
When using PostgreSQL or MariaDB, you'll also have to back up the database.
|
||||
|
||||
.. _migrating-restoring:
|
||||
|
||||
@@ -64,9 +66,9 @@ Updating Paperless
|
||||
Docker Route
|
||||
============
|
||||
|
||||
If a new release of paperless-ng is available, upgrading depends on how you
|
||||
installed paperless-ng in the first place. The releases are available at the
|
||||
`release page <https://github.com/jonaswinkler/paperless-ng/releases>`_.
|
||||
If a new release of paperless-ngx is available, upgrading depends on how you
|
||||
installed paperless-ngx in the first place. The releases are available at the
|
||||
`release page <https://github.com/paperless-ngx/paperless-ngx/releases>`_.
|
||||
|
||||
First of all, ensure that paperless is stopped.
|
||||
|
||||
@@ -92,31 +94,47 @@ B. If you built the image yourself, do the following:
|
||||
.. code:: shell-session
|
||||
|
||||
$ git pull
|
||||
$ ./compile-frontend.sh
|
||||
$ docker-compose build
|
||||
$ docker-compose up
|
||||
|
||||
Running ``docker-compose up`` will also apply any new database migrations.
|
||||
If you see everything working, press CTRL+C once to gracefully stop paperless.
|
||||
Then you can start paperless-ng with ``-d`` to have it run in the background.
|
||||
Then you can start paperless-ngx with ``-d`` to have it run in the background.
|
||||
|
||||
.. note::
|
||||
|
||||
In version 0.9.14, the update process was changed. In 0.9.13 and earlier, the
|
||||
docker-compose files specified exact versions, so pulling won't automatically
|
||||
update to newer versions. In order to enable updates as described above, either
|
||||
get the new ``docker-compose.yml`` file from `here <https://github.com/jonaswinkler/paperless-ng/tree/master/docker/compose>`_
|
||||
get the new ``docker-compose.yml`` file from `here <https://github.com/paperless-ngx/paperless-ngx/tree/master/docker/compose>`_
|
||||
or edit the ``docker-compose.yml`` file, find the line that says
|
||||
|
||||
.. code::
|
||||
|
||||
image: jonaswinkler/paperless-ng:0.9.x
|
||||
image: ghcr.io/paperless-ngx/paperless-ngx:0.9.x
|
||||
|
||||
and replace the version with ``latest``:
|
||||
|
||||
.. code::
|
||||
|
||||
image: jonaswinkler/paperless-ng:latest
|
||||
image: ghcr.io/paperless-ngx/paperless-ngx:latest
|
||||
|
||||
.. note::
|
||||
In version 1.7.1 and onwards, the Docker image can now be pinned to a release series.
|
||||
This is often combined with automatic updaters such as Watchtower to allow safer
|
||||
unattended upgrading to new bugfix releases only. It is still recommended to always
|
||||
review release notes before upgrading. To pin your install to a release series, edit
|
||||
the ``docker-compose.yml`` and find the line that says
|
||||
|
||||
.. code::
|
||||
|
||||
image: ghcr.io/paperless-ngx/paperless-ngx:latest
|
||||
|
||||
and replace the version with the series you want to track, for example:
|
||||
|
||||
.. code::
|
||||
|
||||
image: ghcr.io/paperless-ngx/paperless-ngx:1.7
|
||||
|
||||
Bare Metal Route
|
||||
================
|
||||
@@ -144,32 +162,6 @@ After grabbing the new release and unpacking the contents, do the following:
|
||||
This might not actually do anything. Not every new paperless version comes with new
|
||||
database migrations.
|
||||
|
||||
Ansible Route
|
||||
=============
|
||||
|
||||
Most of the update process is automated when using the ansible role.
|
||||
|
||||
1. Update the role to the target release tag to make sure the ansible scripts are compatible:
|
||||
|
||||
.. code:: shell-session
|
||||
|
||||
$ ansible-galaxy install git+https://github.com/jonaswinkler/paperless-ng.git,master --force
|
||||
|
||||
2. Update the role variable definitions ``vars/paperless-ng.yml`` (where appropriate).
|
||||
|
||||
3. Run the ansible playbook you created during :ref:`installation <setup-ansible>` again:
|
||||
|
||||
.. note::
|
||||
|
||||
When ansible detects that an update run is in progress, it backs up the entire ``paperlessng_directory`` to ``paperlessng_directory-TIMESTAMP``.
|
||||
Updates can be rolled back by simply moving the timestamped folder back to the original location.
|
||||
If the update succeeds and you want to continue using the new release, please don't forget to delete the backup folder.
|
||||
|
||||
.. code:: shell-session
|
||||
|
||||
$ ansible-playbook playbook.yml
|
||||
|
||||
|
||||
Downgrading Paperless
|
||||
#####################
|
||||
|
||||
@@ -297,6 +289,10 @@ When you use the provided docker compose script, put the export inside the
|
||||
``export`` folder in your paperless source directory. Specify ``../export``
|
||||
as the ``source``.
|
||||
|
||||
.. note::
|
||||
|
||||
Importing from a previous version of Paperless may work, but for best results
|
||||
it is suggested to match the versions.
|
||||
|
||||
.. _utilities-retagger:
|
||||
|
||||
@@ -316,6 +312,7 @@ there are tools for it.
|
||||
-c, --correspondent
|
||||
-T, --tags
|
||||
-t, --document_type
|
||||
-s, --storage_path
|
||||
-i, --inbox-only
|
||||
--use-first
|
||||
-f, --overwrite
|
||||
@@ -324,7 +321,7 @@ Run this after changing or adding matching rules. It'll loop over all
|
||||
of the documents in your database and attempt to match documents
|
||||
according to the new rules.
|
||||
|
||||
Specify any combination of ``-c``, ``-T`` and ``-t`` to have the
|
||||
Specify any combination of ``-c``, ``-T``, ``-t`` and ``-s`` to have the
|
||||
retagger perform matching of the specified metadata type. If you don't
|
||||
specify any of these options, the document retagger won't do anything.
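For illustration, a run that matches correspondents, tags, and document types inside the Docker container could look like the sketch below; the ``document_retagger`` management command and the ``docker-compose exec`` wrapper are assumed from the standard setup and may differ in your installation.

.. code:: shell-session

    $ docker-compose exec webserver document_retagger -c -T -t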
|
||||
|
||||
@@ -396,8 +393,8 @@ the naming scheme.
|
||||
|
||||
.. warning::
|
||||
|
||||
Since this command moves you documents around alot, it is advised to to
|
||||
a backup before. The renaming logic is robust and will never overwrite
|
||||
Since this command moves your documents, it is advised to do
|
||||
a backup beforehand. The renaming logic is robust and will never overwrite
|
||||
or delete a file, but you can't ever be careful enough.
|
||||
|
||||
.. code::
|
||||
@@ -406,6 +403,8 @@ the naming scheme.
|
||||
|
||||
The command takes no arguments and processes all your documents at once.
|
||||
|
||||
Learn how to use :ref:`Management Utilities<utilities-management-commands>`.
|
||||
|
||||
|
||||
.. _utilities-sanity-checker:
|
||||
|
||||
@@ -493,7 +492,7 @@ Documents can be stored in Paperless using GnuPG encryption.
|
||||
|
||||
.. danger::
|
||||
|
||||
Encryption is deprecated since paperless-ng 0.9 and doesn't really provide any
|
||||
Encryption is deprecated since paperless-ngx 0.9 and doesn't really provide any
|
||||
additional security, since you have to store the passphrase in a configuration
|
||||
file on the same system as the encrypted documents for paperless to work.
|
||||
Furthermore, the entire text content of the documents is stored plain in the
|
||||
|
@@ -7,25 +7,25 @@ easier.
|
||||
|
||||
.. _advanced-matching:
|
||||
|
||||
Matching tags, correspondents and document types
|
||||
################################################
|
||||
Matching tags, correspondents, document types, and storage paths
|
||||
################################################################
|
||||
|
||||
Paperless will compare the matching algorithms defined by every tag and
|
||||
correspondent already set in your database to see if they apply to the text in
|
||||
a document. In other words, if you defined a tag called ``Home Utility``
|
||||
Paperless will compare the matching algorithms defined by every tag, correspondent,
|
||||
document type, and storage path in your database to see if they apply to the text
|
||||
in a document. In other words, if you define a tag called ``Home Utility``
|
||||
that has a ``match`` property of ``bc hydro`` and a ``matching_algorithm`` of
|
||||
``literal``, Paperless will automatically tag your newly-consumed document with
|
||||
your ``Home Utility`` tag so long as the text ``bc hydro`` appears in the body
|
||||
of the document somewhere.
|
||||
|
||||
The matching logic is quite powerful, and supports searching the text of your
|
||||
The matching logic is quite powerful. It supports searching the text of your
|
||||
document with different algorithms, and as such, some experimentation may be
|
||||
necessary to get things right.
|
||||
|
||||
In order to have a tag, correspondent or type assigned automatically to newly
|
||||
consumed documents, assign a match and matching algorithm using the web
|
||||
interface. These settings define when to assign correspondents, tags and types
|
||||
to documents.
|
||||
In order to have a tag, correspondent, document type, or storage path assigned
|
||||
automatically to newly consumed documents, assign a match and matching algorithm
|
||||
using the web interface. These settings define when to assign tags, correspondents,
|
||||
document types, and storage paths to documents.
|
||||
|
||||
The following algorithms are available:
|
||||
|
||||
@@ -34,22 +34,22 @@ The following algorithms are available:
|
||||
either of these terms.
|
||||
* **All:** Requires that every word provided appears in the PDF, albeit not in the
|
||||
order provided.
|
||||
* **Literal:** Matches only if the match appears exactly as provided in the PDF.
|
||||
* **Literal:** Matches only if the match appears exactly as provided (i.e. preserving word order) in the PDF.
|
||||
* **Regular expression:** Parses the match as a regular expression and tries to
|
||||
find a match within the document.
|
||||
* **Fuzzy match:** I dont know. Look at the source.
|
||||
* **Fuzzy match:** I don't know. Look at the source.
|
||||
* **Auto:** Tries to automatically match new documents. This does not require you
|
||||
to set a match. See the notes below.
|
||||
|
||||
When using the "any" or "all" matching algorithms, you can search for terms
|
||||
When using the *any* or *all* matching algorithms, you can search for terms
|
||||
that consist of multiple words by enclosing them in double quotes. For example,
|
||||
defining a match text of ``"Bank of America" BofA`` using the "any" algorithm,
|
||||
defining a match text of ``"Bank of America" BofA`` using the *any* algorithm,
|
||||
will match documents that contain either "Bank of America" or "BofA", but will
|
||||
not match documents containing "Bank of South America".
|
||||
|
||||
Then just save your tag/correspondent and run another document through the
|
||||
consumer. Once complete, you should see the newly-created document,
|
||||
automatically tagged with the appropriate data.
|
||||
Then just save your tag, correspondent, document type, or storage path and run
|
||||
another document through the consumer. Once complete, you should see the
|
||||
newly-created document, automatically tagged with the appropriate data.
|
||||
|
||||
|
||||
.. _advanced-automatic_matching:
|
||||
@@ -57,10 +57,10 @@ automatically tagged with the appropriate data.
|
||||
Automatic matching
|
||||
==================
|
||||
|
||||
Paperless-ng comes with a new matching algorithm called *Auto*. This matching
|
||||
algorithm tries to assign tags, correspondents and document types to your
|
||||
documents based on how you have assigned these on existing documents. It
|
||||
uses a neural network under the hood.
|
||||
Paperless-ngx comes with a new matching algorithm called *Auto*. This matching
|
||||
algorithm tries to assign tags, correspondents, document types, and storage paths
|
||||
to your documents based on how you have already assigned these on existing documents.
|
||||
It uses a neural network under the hood.
|
||||
|
||||
If, for example, all your bank statements of your account 123 at the Bank of
|
||||
America are tagged with the tag "bofa_123" and the matching algorithm of this
|
||||
@@ -76,24 +76,25 @@ feature:
|
||||
changes. Paperless periodically (default: once each hour) checks for changes
|
||||
and does this automatically for you.
|
||||
* The Auto matching algorithm only takes documents into account which are NOT
|
||||
placed in your inbox (i.e., have inbox tags assigned to them). This ensures
|
||||
placed in your inbox (i.e. have any inbox tags assigned to them). This ensures
|
||||
that the neural network only learns from documents which you have correctly
|
||||
tagged before.
|
||||
* The matching algorithm can only work if there is a correlation between the
|
||||
tag, correspondent or document type and the document itself. Your bank
|
||||
statements usually contain your bank account number and the name of the bank,
|
||||
so this works reasonably well, However, tags such as "TODO" cannot be
|
||||
automatically assigned.
|
||||
tag, correspondent, document type, or storage path and the document itself.
|
||||
Your bank statements usually contain your bank account number and the name
|
||||
of the bank, so this works reasonably well. However, tags such as "TODO"
|
||||
cannot be automatically assigned.
|
||||
* The matching algorithm needs a reasonable number of documents to identify when
|
||||
to assign tags, correspondents, and types. If one out of a thousand documents
|
||||
has the correspondent "Very obscure web shop I bought something five years
|
||||
ago", it will probably not assign this correspondent automatically if you buy
|
||||
something from them again. The more documents, the better.
|
||||
to assign tags, correspondents, storage paths, and types. If one out of a
|
||||
thousand documents has the correspondent "Very obscure web shop I bought
|
||||
something five years ago", it will probably not assign this correspondent
|
||||
automatically if you buy something from them again. The more documents, the better.
|
||||
* Paperless also needs a reasonable amount of negative examples to decide when
|
||||
not to assign a certain tag, correspondent or type. This will usually be the
|
||||
case as you start filling up paperless with documents. Example: If all your
|
||||
documents are either from "Webshop" and "Bank", paperless will assign one of
|
||||
these correspondents to ANY new document, if both are set to automatic matching.
|
||||
not to assign a certain tag, correspondent, document type, or storage path. This will
|
||||
usually be the case as you start filling up paperless with documents.
|
||||
Example: If all your documents are either from "Webshop" and "Bank", paperless
|
||||
will assign one of these correspondents to ANY new document, if both are set
|
||||
to automatic matching.
|
||||
|
||||
Hooking into the consumption process
|
||||
####################################
|
||||
@@ -104,7 +105,7 @@ you execute scripts of your own choosing just before or after a document is
|
||||
consumed, using a couple of simple hooks.
|
||||
|
||||
Just write a script, put it somewhere that Paperless can read & execute, and
|
||||
then put the path to that script in ``paperless.conf`` with the variable name
|
||||
then put the path to that script in ``paperless.conf`` or ``docker-compose.env`` with the variable name
|
||||
of either ``PAPERLESS_PRE_CONSUME_SCRIPT`` or
|
||||
``PAPERLESS_POST_CONSUME_SCRIPT``.
|
||||
|
||||
@@ -120,10 +121,10 @@ Pre-consumption script
|
||||
======================
|
||||
|
||||
Executed after the consumer sees a new document in the consumption folder, but
|
||||
before any processing of the document is performed. This script receives exactly
|
||||
one argument:
|
||||
before any processing of the document is performed. This script can access the
|
||||
following relevant environment variables:
|
||||
|
||||
* Document file name
|
||||
* ``DOCUMENT_SOURCE_PATH``
|
||||
|
||||
A simple but common example for this would be creating a script like
|
||||
this:
|
||||
@@ -133,7 +134,7 @@ this:
|
||||
.. code:: bash
|
||||
|
||||
#!/usr/bin/env bash
|
||||
pdf2pdfocr.py -i ${1}
|
||||
pdf2pdfocr.py -i ${DOCUMENT_SOURCE_PATH}
|
||||
|
||||
``/etc/paperless.conf``
|
||||
|
||||
@@ -156,23 +157,57 @@ Post-consumption script
|
||||
=======================
|
||||
|
||||
Executed after the consumer has successfully processed a document and has moved it
|
||||
into paperless. It receives the following arguments:
|
||||
into paperless. It receives the following environment variables:
|
||||
|
||||
* Document id
|
||||
* Generated file name
|
||||
* Source path
|
||||
* Thumbnail path
|
||||
* Download URL
|
||||
* Thumbnail URL
|
||||
* Correspondent
|
||||
* Tags
|
||||
* ``DOCUMENT_ID``
|
||||
* ``DOCUMENT_FILE_NAME``
|
||||
* ``DOCUMENT_CREATED``
|
||||
* ``DOCUMENT_MODIFIED``
|
||||
* ``DOCUMENT_ADDED``
|
||||
* ``DOCUMENT_SOURCE_PATH``
|
||||
* ``DOCUMENT_ARCHIVE_PATH``
|
||||
* ``DOCUMENT_THUMBNAIL_PATH``
|
||||
* ``DOCUMENT_DOWNLOAD_URL``
|
||||
* ``DOCUMENT_THUMBNAIL_URL``
|
||||
* ``DOCUMENT_CORRESPONDENT``
|
||||
* ``DOCUMENT_TAGS``
|
||||
* ``DOCUMENT_ORIGINAL_FILENAME``
|
||||
|
||||
The script can be in any language you like, but for a simple shell script
|
||||
example, you can take a look at ``post-consumption-example.sh`` in the
|
||||
``scripts`` directory in this project.
|
||||
The script can be in any language, but for a simple shell script
|
||||
example, you can take a look at `post-consumption-example.sh`_ in this project.
|
||||
|
||||
The post consumption script cannot cancel the consumption process.
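As a rough sketch (not the bundled example script), a post-consumption hook that only appends a line to a log file could look like this, using the environment variables listed above:

.. code:: bash

    #!/usr/bin/env bash
    # Append the id and file name of every consumed document to a log (illustrative only).
    echo "$(date -Is) consumed document ${DOCUMENT_ID}: ${DOCUMENT_FILE_NAME}" \
        >> /usr/src/paperless/scripts/post-consumption.log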
|
||||
|
||||
Docker
|
||||
------
|
||||
Assuming you have ``/home/foo/paperless-ngx/scripts/post-consumption-example.sh``.
|
||||
|
||||
You can pass that script into the consumer container via a host mount in your ``docker-compose.yml``.
|
||||
|
||||
.. code:: yaml
|
||||
|
||||
...
|
||||
consumer:
|
||||
...
|
||||
volumes:
|
||||
...
|
||||
- /home/paperless-ngx/scripts:/path/in/container/scripts/
|
||||
...
|
||||
|
||||
Example (docker-compose.yml): ``- /home/foo/paperless-ngx/scripts:/usr/src/paperless/scripts``
|
||||
|
||||
which in turn requires the variable ``PAPERLESS_POST_CONSUME_SCRIPT`` in ``docker-compose.env`` to point to ``/path/in/container/scripts/post-consumption-example.sh``.
|
||||
|
||||
Example (docker-compose.env): ``PAPERLESS_POST_CONSUME_SCRIPT=/usr/src/paperless/scripts/post-consumption-example.sh``
|
||||
|
||||
Troubleshooting:
|
||||
|
||||
- Monitor the docker-compose log ``cd ~/paperless-ngx; docker-compose logs -f``
|
||||
- Check your script's permissions, e.g. in case of a permission error: ``sudo chmod 755 post-consumption-example.sh``
|
||||
- Pipe your script's output to a log file, e.g. ``echo "${DOCUMENT_ID}" | tee --append /usr/src/paperless/scripts/post-consumption-example.log``
|
||||
|
||||
.. _post-consumption-example.sh: https://github.com/paperless-ngx/paperless-ngx/blob/main/scripts/post-consumption-example.sh
|
||||
|
||||
.. _advanced-file_name_handling:
|
||||
|
||||
File name handling
|
||||
@@ -183,7 +218,8 @@ using the identifier which it has assigned to each document. You will end up get
|
||||
files like ``0000123.pdf`` in your media directory. This isn't necessarily a bad
|
||||
thing, because you normally don't have to access these files manually. However, if
|
||||
you wish to name your files differently, you can do that by adjusting the
|
||||
``PAPERLESS_FILENAME_FORMAT`` configuration option.
|
||||
``PAPERLESS_FILENAME_FORMAT`` configuration option. Paperless adds the correct
|
||||
file extension, e.g. ``.pdf`` or ``.jpg``, automatically.
|
||||
|
||||
This variable allows you to configure the filename (folders are allowed) using
|
||||
placeholders. For example, configuring this to
|
||||
@@ -214,21 +250,21 @@ will create a directory structure as follows:
|
||||
last filename a document was stored as. If you do rename a file, paperless will
|
||||
report your files as missing and won't be able to find them.
|
||||
|
||||
Paperless provides the following placeholders withing filenames:
|
||||
Paperless provides the following placeholders within filenames:
|
||||
|
||||
* ``{asn}``: The archive serial number of the document, or "none".
|
||||
* ``{correspondent}``: The name of the correspondent, or "none".
|
||||
* ``{document_type}``: The name of the document type, or "none".
|
||||
* ``{tag_list}``: A comma separated list of all tags assigned to the document.
|
||||
* ``{title}``: The title of the document.
|
||||
* ``{created}``: The full date and time the document was created.
|
||||
* ``{created}``: The full date (ISO format) the document was created.
|
||||
* ``{created_year}``: Year created only.
|
||||
* ``{created_month}``: Month created only (number 1-12).
|
||||
* ``{created_day}``: Day created only (number 1-31).
|
||||
* ``{added}``: The full date and time the document was added to paperless.
|
||||
* ``{created_month}``: Month created only (number 01-12).
|
||||
* ``{created_day}``: Day created only (number 01-31).
|
||||
* ``{added}``: The full date (ISO format) the document was added to paperless.
|
||||
* ``{added_year}``: Year added only.
|
||||
* ``{added_month}``: Month added only (number 1-12).
|
||||
* ``{added_day}``: Day added only (number 1-31).
|
||||
* ``{added_month}``: Month added only (number 01-12).
|
||||
* ``{added_day}``: Day added only (number 01-31).
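As an illustration, a year/correspondent layout similar to the storage path example further below could be configured like this (a sketch; adjust to taste):

.. code::

    PAPERLESS_FILENAME_FORMAT={created_year}/{correspondent}/{title}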
|
||||
|
||||
|
||||
Paperless will try to conserve the information from your database as much as possible.
|
||||
@@ -239,6 +275,17 @@ If paperless detects that two documents share the same filename, paperless will
|
||||
append ``_01``, ``_02``, etc to the filename. This happens if all the placeholders in a filename
|
||||
evaluate to the same value.
|
||||
|
||||
.. hint::
|
||||
You can affect how empty placeholders are treated by changing the following setting to
|
||||
`true`.
|
||||
|
||||
.. code::
|
||||
|
||||
PAPERLESS_FILENAME_FORMAT_REMOVE_NONE=True
|
||||
|
||||
Doing this results in all empty placeholders resolving to "" instead of "none" as stated above.
|
||||
Spaces before empty placeholders are removed as well, and empty directories are omitted.
|
||||
|
||||
.. hint::
|
||||
|
||||
Paperless checks the filename of a document whenever it is saved. Therefore,
|
||||
@@ -261,3 +308,59 @@ evaluate to the same value.
|
||||
|
||||
However, keep in mind that inside docker, if files get stored outside of the
|
||||
predefined volumes, they will be lost after a restart of paperless.
|
||||
|
||||
|
||||
Storage paths
|
||||
#############
|
||||
|
||||
One of the best things in Paperless is that you can not only access the documents via the
|
||||
web interface, but also via the file system.
|
||||
|
||||
When a single storage layout is not sufficient for your use case, storage paths come to
|
||||
the rescue. Storage paths allow you to configure more precisely where each document is stored
|
||||
in the file system.
|
||||
|
||||
- Each storage path is a `PAPERLESS_FILENAME_FORMAT` and follows the rules described above
|
||||
- Each document is assigned a storage path using the matching algorithms described above, but
|
||||
can be overwritten at any time
|
||||
|
||||
For example, you could define the following two storage paths:
|
||||
|
||||
1. Normal communications are put into a folder structure sorted by `year/correspondent`
|
||||
2. Communications with insurance companies are stored in a flat structure with longer file names,
|
||||
but containing the full date of the correspondence.
|
||||
|
||||
.. code::
|
||||
|
||||
By Year = {created_year}/{correspondent}/{title}
|
||||
Insurances = Insurances/{correspondent}/{created_year}-{created_month}-{created_day} {title}
|
||||
|
||||
|
||||
If you then map these storage paths to the documents, you might get the following result.
|
||||
For simplicity, `By Year` defines the same structure as in the previous example above.
|
||||
|
||||
.. code:: text
|
||||
|
||||
2019/ # By Year
|
||||
My bank/
|
||||
Statement January.pdf
|
||||
Statement February.pdf
|
||||
|
||||
Insurances/ # Insurances
|
||||
Healthcare 123/
|
||||
2022-01-01 Statement January.pdf
|
||||
2022-02-02 Letter.pdf
|
||||
2022-02-03 Letter.pdf
|
||||
Dental 456/
|
||||
2021-12-01 New Conditions.pdf
|
||||
|
||||
|
||||
.. hint::
|
||||
|
||||
Defining a storage path is optional. If no storage path is defined for a document, the global
|
||||
`PAPERLESS_FILENAME_FORMAT` is applied.
|
||||
|
||||
.. caution::
|
||||
|
||||
If you adjust the format of an existing storage path, old documents don't get relocated automatically.
|
||||
You need to run the :ref:`document renamer <utilities-renamer>` to adjust their paths.
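A sketch of such a run inside the Docker container; the ``document_renamer`` command name is an assumption here, see the renamer section of the management utilities for the exact invocation on your setup:

.. code:: shell-session

    $ docker-compose exec webserver document_renamer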
|
||||
|
@@ -31,7 +31,8 @@ The objects served by the document endpoint contain the following fields:
|
||||
* ``tags``: List of IDs of tags assigned to this document, or empty list.
|
||||
* ``document_type``: Document type of this document, or null.
|
||||
* ``correspondent``: Correspondent of this document or null.
|
||||
* ``created``: The date at which this document was created.
|
||||
* ``created``: The date and time at which this document was created.
|
||||
* ``created_date``: The date (YYYY-MM-DD) at which this document was created. Optional. If also passed with created, this is ignored.
|
||||
* ``modified``: The date at which this document was last edited in paperless. Read-only.
|
||||
* ``added``: The date at which this document was added to paperless. Read-only.
|
||||
* ``archive_serial_number``: The identifier of this document in a physical document archive.
|
||||
@@ -60,7 +61,7 @@ The endpoints correctly serve the response header fields ``Content-Disposition``
|
||||
and ``Content-Type`` to indicate the filename for download and the type of content of
|
||||
the document.
|
||||
|
||||
In order to download or preview the original document when an archied document is available,
|
||||
In order to download or preview the original document when an archived document is available,
|
||||
supply the query parameter ``original=true``.
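For instance, an illustrative request (host, token, and document id are placeholders, and the ``/api/documents/<id>/download/`` path is assumed here):

.. code:: shell-session

    $ curl -H "Authorization: Token <api-token>" \
        -o original.pdf \
        "http://localhost:8000/api/documents/123/download/?original=true"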
|
||||
|
||||
.. hint::
|
||||
@@ -240,11 +241,13 @@ be instructed to consume the document from there.
|
||||
The endpoint supports the following optional form fields:
|
||||
|
||||
* ``title``: Specify a title that the consumer should use for the document.
|
||||
* ``created``: Specify a DateTime where the document was created (e.g. "2016-04-19" or "2016-04-19 06:15:00+02:00").
|
||||
* ``correspondent``: Specify the ID of a correspondent that the consumer should use for the document.
|
||||
* ``document_type``: Similar to correspondent.
|
||||
* ``tags``: Similar to correspondent. Specify this multiple times to have multiple tags added
|
||||
to the document.
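A minimal upload using ``curl`` might look like the following sketch; the ``/api/documents/post_document/`` path, the ``document`` form field, and the credentials are assumptions:

.. code:: shell-session

    $ curl -u <username>:<password> \
        -F document=@invoice.pdf \
        -F title="Invoice March" \
        http://localhost:8000/api/documents/post_document/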
|
||||
|
||||
|
||||
The endpoint will immediately return "OK" if the document consumption process
|
||||
was started successfully. No additional status information about the consumption
|
||||
process itself is available, since that happens in a different process.
|
||||
@@ -255,7 +258,7 @@ process itself is available, since that happens in a different process.
|
||||
API Versioning
|
||||
##############
|
||||
|
||||
The REST API is versioned since Paperless-ng 1.3.0.
|
||||
The REST API is versioned since Paperless-ngx 1.3.0.
|
||||
|
||||
* Versioning ensures that changes to the API don't break older clients.
|
||||
* Clients specify the specific version of the API they wish to use with every request and Paperless will handle the request using the specified API version.
|
||||
|
2263
docs/changelog.md
Normal file
1597
docs/changelog.rst
210
docs/conf.py
@@ -2,33 +2,39 @@ import sphinx_rtd_theme
|
||||
|
||||
|
||||
__version__ = None
|
||||
__full_version_str__ = None
|
||||
__major_minor_version_str__ = None
|
||||
exec(open("../src/paperless/version.py").read())
|
||||
|
||||
|
||||
extensions = [
|
||||
'sphinx.ext.autodoc',
|
||||
'sphinx.ext.intersphinx',
|
||||
'sphinx.ext.todo',
|
||||
'sphinx.ext.imgmath',
|
||||
'sphinx.ext.viewcode',
|
||||
'sphinx_rtd_theme',
|
||||
"sphinx.ext.autodoc",
|
||||
"sphinx.ext.intersphinx",
|
||||
"sphinx.ext.todo",
|
||||
"sphinx.ext.imgmath",
|
||||
"sphinx.ext.viewcode",
|
||||
"sphinx_rtd_theme",
|
||||
"myst_parser",
|
||||
]
|
||||
|
||||
# Add any paths that contain templates here, relative to this directory.
|
||||
# templates_path = ['_templates']
|
||||
templates_path = ["_templates"]
|
||||
|
||||
# The suffix of source filenames.
|
||||
source_suffix = '.rst'
|
||||
source_suffix = {
|
||||
".rst": "restructuredtext",
|
||||
".md": "markdown",
|
||||
}
|
||||
|
||||
# The encoding of source files.
|
||||
#source_encoding = 'utf-8-sig'
|
||||
# source_encoding = 'utf-8-sig'
|
||||
|
||||
# The master toctree document.
|
||||
master_doc = 'index'
|
||||
master_doc = "index"
|
||||
|
||||
# General information about the project.
|
||||
project = u'Paperless-ng'
|
||||
copyright = u'2021, Daniel Quinn, Jonas Winkler'
|
||||
project = "Paperless-ngx"
|
||||
copyright = "2015-2022, Daniel Quinn, Jonas Winkler, and the paperless-ngx team"
|
||||
|
||||
# The version info for the project you're documenting, acts as replacement for
|
||||
# |version| and |release|, also used in various other places throughout the
|
||||
@@ -41,186 +47,190 @@ copyright = u'2021, Daniel Quinn, Jonas Winkler'
|
||||
#
|
||||
|
||||
# The short X.Y version.
|
||||
version = ".".join([str(_) for _ in __version__[:2]])
|
||||
version = __major_minor_version_str__
|
||||
# The full version, including alpha/beta/rc tags.
|
||||
release = ".".join([str(_) for _ in __version__[:3]])
|
||||
release = __full_version_str__
|
||||
|
||||
# The language for content autogenerated by Sphinx. Refer to documentation
|
||||
# for a list of supported languages.
|
||||
#language = None
|
||||
# language = None
|
||||
|
||||
# There are two options for replacing |today|: either, you set today to some
|
||||
# non-false value, then it is used:
|
||||
#today = ''
|
||||
# today = ''
|
||||
# Else, today_fmt is used as the format for a strftime call.
|
||||
#today_fmt = '%B %d, %Y'
|
||||
# today_fmt = '%B %d, %Y'
|
||||
|
||||
# List of patterns, relative to source directory, that match files and
|
||||
# directories to ignore when looking for source files.
|
||||
exclude_patterns = ['_build']
|
||||
exclude_patterns = ["_build"]
|
||||
|
||||
# The reST default role (used for this markup: `text`) to use for all
|
||||
# documents.
|
||||
#default_role = None
|
||||
# default_role = None
|
||||
|
||||
# If true, '()' will be appended to :func: etc. cross-reference text.
|
||||
#add_function_parentheses = True
|
||||
# add_function_parentheses = True
|
||||
|
||||
# If true, the current module name will be prepended to all description
|
||||
# unit titles (such as .. function::).
|
||||
#add_module_names = True
|
||||
# add_module_names = True
|
||||
|
||||
# If true, sectionauthor and moduleauthor directives will be shown in the
|
||||
# output. They are ignored by default.
|
||||
#show_authors = False
|
||||
# show_authors = False
|
||||
|
||||
# The name of the Pygments (syntax highlighting) style to use.
|
||||
pygments_style = 'sphinx'
|
||||
pygments_style = "sphinx"
|
||||
|
||||
# A list of ignored prefixes for module index sorting.
|
||||
#modindex_common_prefix = []
|
||||
# modindex_common_prefix = []
|
||||
|
||||
# If true, keep warnings as "system message" paragraphs in the built documents.
|
||||
#keep_warnings = False
|
||||
# keep_warnings = False
|
||||
|
||||
|
||||
# -- Options for HTML output ----------------------------------------------
|
||||
|
||||
# The theme to use for HTML and HTML Help pages. See the documentation for
|
||||
# a list of builtin themes.
|
||||
html_theme = 'sphinx_rtd_theme'
|
||||
html_theme = "sphinx_rtd_theme"
|
||||
|
||||
# Theme options are theme-specific and customize the look and feel of a theme
|
||||
# further. For a list of options available for each theme, see the
|
||||
# documentation.
|
||||
#html_theme_options = {}
|
||||
# html_theme_options = {}
|
||||
|
||||
# Add any paths that contain custom themes here, relative to this directory.
|
||||
html_theme_path = []
|
||||
|
||||
# The name for this set of Sphinx documents. If None, it defaults to
|
||||
# "<project> v<release> documentation".
|
||||
#html_title = None
|
||||
# html_title = None
|
||||
|
||||
# A shorter title for the navigation bar. Default is the same as html_title.
|
||||
#html_short_title = None
|
||||
# html_short_title = None
|
||||
|
||||
# The name of an image file (relative to this directory) to place at the top
|
||||
# of the sidebar.
|
||||
#html_logo = None
|
||||
# html_logo = None
|
||||
|
||||
# The name of an image file (within the static path) to use as favicon of the
|
||||
# docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32
|
||||
# pixels large.
|
||||
#html_favicon = None
|
||||
# html_favicon = None
|
||||
|
||||
# Add any paths that contain custom static files (such as style sheets) here,
|
||||
# relative to this directory. They are copied after the builtin static files,
|
||||
# so a file named "default.css" will overwrite the builtin "default.css".
|
||||
html_static_path = ['_static']
|
||||
html_static_path = ["_static"]
|
||||
|
||||
# These paths are either relative to html_static_path
|
||||
# or fully qualified paths (eg. https://...)
|
||||
html_css_files = [
|
||||
"css/custom.css",
|
||||
]
|
||||
|
||||
html_js_files = [
|
||||
"js/darkmode.js",
|
||||
]
|
||||
|
||||
# Add any extra paths that contain custom files (such as robots.txt or
|
||||
# .htaccess) here, relative to this directory. These files are copied
|
||||
# directly to the root of the documentation.
|
||||
#html_extra_path = []
|
||||
# html_extra_path = []
|
||||
|
||||
# If not '', a 'Last updated on:' timestamp is inserted at every page bottom,
|
||||
# using the given strftime format.
|
||||
#html_last_updated_fmt = '%b %d, %Y'
|
||||
# html_last_updated_fmt = '%b %d, %Y'
|
||||
|
||||
# If true, SmartyPants will be used to convert quotes and dashes to
|
||||
# typographically correct entities.
|
||||
#html_use_smartypants = True
|
||||
# html_use_smartypants = True
|
||||
|
||||
# Custom sidebar templates, maps document names to template names.
|
||||
#html_sidebars = {}
|
||||
# html_sidebars = {}
|
||||
|
||||
# Additional templates that should be rendered to pages, maps page names to
|
||||
# template names.
|
||||
#html_additional_pages = {}
|
||||
# html_additional_pages = {}
|
||||
|
||||
# If false, no module index is generated.
|
||||
#html_domain_indices = True
|
||||
# html_domain_indices = True
|
||||
|
||||
# If false, no index is generated.
|
||||
#html_use_index = True
|
||||
# html_use_index = True
|
||||
|
||||
# If true, the index is split into individual pages for each letter.
|
||||
#html_split_index = False
|
||||
# html_split_index = False
|
||||
|
||||
# If true, links to the reST sources are added to the pages.
|
||||
#html_show_sourcelink = True
|
||||
# html_show_sourcelink = True
|
||||
|
||||
# If true, "Created using Sphinx" is shown in the HTML footer. Default is True.
|
||||
#html_show_sphinx = True
|
||||
# html_show_sphinx = True
|
||||
|
||||
# If true, "(C) Copyright ..." is shown in the HTML footer. Default is True.
|
||||
#html_show_copyright = True
|
||||
# html_show_copyright = True
|
||||
|
||||
# If true, an OpenSearch description file will be output, and all pages will
|
||||
# contain a <link> tag referring to it. The value of this option must be the
|
||||
# base URL from which the finished HTML is served.
|
||||
#html_use_opensearch = ''
|
||||
# html_use_opensearch = ''
|
||||
|
||||
# This is the file name suffix for HTML files (e.g. ".xhtml").
|
||||
#html_file_suffix = None
|
||||
# html_file_suffix = None
|
||||
|
||||
# Output file base name for HTML help builder.
|
||||
htmlhelp_basename = 'paperless'
|
||||
htmlhelp_basename = "paperless"
|
||||
|
||||
# -- Options for LaTeX output ---------------------------------------------
|
||||
|
||||
latex_elements = {
|
||||
# The paper size ('letterpaper' or 'a4paper').
|
||||
#'papersize': 'letterpaper',
|
||||
|
||||
# The font size ('10pt', '11pt' or '12pt').
|
||||
#'pointsize': '10pt',
|
||||
|
||||
# Additional stuff for the LaTeX preamble.
|
||||
#'preamble': '',
|
||||
# The paper size ('letterpaper' or 'a4paper').
|
||||
#'papersize': 'letterpaper',
|
||||
# The font size ('10pt', '11pt' or '12pt').
|
||||
#'pointsize': '10pt',
|
||||
# Additional stuff for the LaTeX preamble.
|
||||
#'preamble': '',
|
||||
}
|
||||
|
||||
# Grouping the document tree into LaTeX files. List of tuples
|
||||
# (source start file, target name, title,
|
||||
# author, documentclass [howto, manual, or own class]).
|
||||
latex_documents = [
|
||||
('index', 'paperless.tex', u'Paperless Documentation',
|
||||
u'Daniel Quinn', 'manual'),
|
||||
("index", "paperless.tex", "Paperless Documentation", "Daniel Quinn", "manual"),
|
||||
]
|
||||
|
||||
# The name of an image file (relative to this directory) to place at the top of
|
||||
# the title page.
|
||||
#latex_logo = None
|
||||
# latex_logo = None
|
||||
|
||||
# For "manual" documents, if this is true, then toplevel headings are parts,
|
||||
# not chapters.
|
||||
#latex_use_parts = False
|
||||
# latex_use_parts = False
|
||||
|
||||
# If true, show page references after internal links.
|
||||
#latex_show_pagerefs = False
|
||||
# latex_show_pagerefs = False
|
||||
|
||||
# If true, show URL addresses after external links.
|
||||
#latex_show_urls = False
|
||||
# latex_show_urls = False
|
||||
|
||||
# Documents to append as an appendix to all manuals.
|
||||
#latex_appendices = []
|
||||
# latex_appendices = []
|
||||
|
||||
# If false, no module index is generated.
|
||||
#latex_domain_indices = True
|
||||
# latex_domain_indices = True
|
||||
|
||||
|
||||
# -- Options for manual page output ---------------------------------------
|
||||
|
||||
# One entry per manual page. List of tuples
|
||||
# (source start file, name, description, authors, manual section).
|
||||
man_pages = [
|
||||
('index', 'paperless', u'Paperless Documentation',
|
||||
[u'Daniel Quinn'], 1)
|
||||
]
|
||||
man_pages = [("index", "paperless", "Paperless Documentation", ["Daniel Quinn"], 1)]
|
||||
|
||||
# If true, show URL addresses after external links.
|
||||
#man_show_urls = False
|
||||
# man_show_urls = False
|
||||
|
||||
|
||||
# -- Options for Texinfo output -------------------------------------------
|
||||
@@ -229,93 +239,99 @@ man_pages = [
|
||||
# (source start file, target name, title, author,
|
||||
# dir menu entry, description, category)
|
||||
texinfo_documents = [
|
||||
('index', 'Paperless', u'Paperless Documentation',
|
||||
u'Daniel Quinn', 'paperless', 'Scan, index, and archive all of your paper documents.',
|
||||
'Miscellaneous'),
|
||||
(
|
||||
"index",
|
||||
"Paperless",
|
||||
"Paperless Documentation",
|
||||
"Daniel Quinn",
|
||||
"paperless",
|
||||
"Scan, index, and archive all of your paper documents.",
|
||||
"Miscellaneous",
|
||||
),
|
||||
]
|
||||
|
||||
# Documents to append as an appendix to all manuals.
|
||||
#texinfo_appendices = []
|
||||
# texinfo_appendices = []
|
||||
|
||||
# If false, no module index is generated.
|
||||
#texinfo_domain_indices = True
|
||||
# texinfo_domain_indices = True
|
||||
|
||||
# How to display URL addresses: 'footnote', 'no', or 'inline'.
|
||||
#texinfo_show_urls = 'footnote'
|
||||
# texinfo_show_urls = 'footnote'
|
||||
|
||||
# If true, do not generate a @detailmenu in the "Top" node's menu.
|
||||
#texinfo_no_detailmenu = False
|
||||
# texinfo_no_detailmenu = False
|
||||
|
||||
|
||||
# -- Options for Epub output ----------------------------------------------
|
||||
|
||||
# Bibliographic Dublin Core info.
|
||||
epub_title = u'Paperless'
|
||||
epub_author = u'Daniel Quinn'
|
||||
epub_publisher = u'Daniel Quinn'
|
||||
epub_copyright = u'2015, Daniel Quinn'
|
||||
epub_title = "Paperless"
|
||||
epub_author = "Daniel Quinn"
|
||||
epub_publisher = "Daniel Quinn"
|
||||
epub_copyright = "2015, Daniel Quinn"
|
||||
|
||||
# The basename for the epub file. It defaults to the project name.
|
||||
#epub_basename = u'Paperless'
|
||||
# epub_basename = u'Paperless'
|
||||
|
||||
# The HTML theme for the epub output. Since the default themes are not optimized
|
||||
# for small screen space, using the same theme for HTML and epub output is
|
||||
# usually not wise. This defaults to 'epub', a theme designed to save visual
|
||||
# space.
|
||||
#epub_theme = 'epub'
|
||||
# epub_theme = 'epub'
|
||||
|
||||
# The language of the text. It defaults to the language option
|
||||
# or en if the language is not set.
|
||||
#epub_language = ''
|
||||
# epub_language = ''
|
||||
|
||||
# The scheme of the identifier. Typical schemes are ISBN or URL.
|
||||
#epub_scheme = ''
|
||||
# epub_scheme = ''
|
||||
|
||||
# The unique identifier of the text. This can be a ISBN number
|
||||
# or the project homepage.
|
||||
#epub_identifier = ''
|
||||
# epub_identifier = ''
|
||||
|
||||
# A unique identification for the text.
|
||||
#epub_uid = ''
|
||||
# epub_uid = ''
|
||||
|
||||
# A tuple containing the cover image and cover page html template filenames.
|
||||
#epub_cover = ()
|
||||
# epub_cover = ()
|
||||
|
||||
# A sequence of (type, uri, title) tuples for the guide element of content.opf.
|
||||
#epub_guide = ()
|
||||
# epub_guide = ()
|
||||
|
||||
# HTML files that should be inserted before the pages created by sphinx.
|
||||
# The format is a list of tuples containing the path and title.
|
||||
#epub_pre_files = []
|
||||
# epub_pre_files = []
|
||||
|
||||
# HTML files that should be inserted after the pages created by sphinx.
|
||||
# The format is a list of tuples containing the path and title.
|
||||
#epub_post_files = []
|
||||
# epub_post_files = []
|
||||
|
||||
# A list of files that should not be packed into the epub file.
|
||||
epub_exclude_files = ['search.html']
|
||||
epub_exclude_files = ["search.html"]
|
||||
|
||||
# The depth of the table of contents in toc.ncx.
|
||||
#epub_tocdepth = 3
|
||||
# epub_tocdepth = 3
|
||||
|
||||
# Allow duplicate toc entries.
|
||||
#epub_tocdup = True
|
||||
# epub_tocdup = True
|
||||
|
||||
# Choose between 'default' and 'includehidden'.
|
||||
#epub_tocscope = 'default'
|
||||
# epub_tocscope = 'default'
|
||||
|
||||
# Fix unsupported image types using the PIL.
|
||||
#epub_fix_images = False
|
||||
# epub_fix_images = False
|
||||
|
||||
# Scale large images.
|
||||
#epub_max_image_width = 0
|
||||
# epub_max_image_width = 0
|
||||
|
||||
# How to display URL addresses: 'footnote', 'no', or 'inline'.
|
||||
#epub_show_urls = 'inline'
|
||||
# epub_show_urls = 'inline'
|
||||
|
||||
# If false, no index is generated.
|
||||
#epub_use_index = True
|
||||
# epub_use_index = True
|
||||
|
||||
|
||||
# Example configuration for intersphinx: refer to the Python standard library.
|
||||
intersphinx_mapping = {'http://docs.python.org/': None}
|
||||
intersphinx_mapping = {"http://docs.python.org/": None}
|
||||
|
@@ -27,11 +27,23 @@ PAPERLESS_REDIS=<url>
|
||||
This is required for processing scheduled tasks such as email fetching, index
|
||||
optimization and for training the automatic document matcher.
|
||||
|
||||
* If your Redis server needs login credentials PAPERLESS_REDIS = ``redis://<username>:<password>@<host>:<port>``
|
||||
|
||||
* With the requirepass option PAPERLESS_REDIS = ``redis://:<password>@<host>:<port>``
|
||||
|
||||
`More information on securing your Redis Instance <https://redis.io/docs/getting-started/#securing-redis>`_.
|
||||
|
||||
Defaults to redis://localhost:6379.
|
||||
|
||||
PAPERLESS_DBENGINE=<engine_name>
|
||||
Optional, gives the ability to choose PostgreSQL or MariaDB as the database engine.
|
||||
Available options are `postgresql` and `mariadb`.
|
||||
Default is `postgresql`.
|
||||
|
||||
PAPERLESS_DBHOST=<hostname>
|
||||
By default, sqlite is used as the database backend. This can be changed here.
|
||||
Set PAPERLESS_DBHOST and PostgreSQL will be used instead of mysql.
|
||||
|
||||
Set PAPERLESS_DBHOST and another database will be used instead of sqlite.
|
||||
|
||||
PAPERLESS_DBPORT=<port>
|
||||
Adjust port if necessary.
|
||||
@@ -39,17 +51,17 @@ PAPERLESS_DBPORT=<port>
|
||||
Default is 5432.
|
||||
|
||||
PAPERLESS_DBNAME=<name>
|
||||
Database name in PostgreSQL.
|
||||
Database name in PostgreSQL or MariaDB.
|
||||
|
||||
Defaults to "paperless".
|
||||
|
||||
PAPERLESS_DBUSER=<name>
|
||||
Database user in PostgreSQL.
|
||||
Database user in PostgreSQL or MariaDB.
|
||||
|
||||
Defaults to "paperless".
|
||||
|
||||
PAPERLESS_DBPASS=<password>
|
||||
Database password for PostgreSQL.
|
||||
Database password for PostgreSQL or MariaDB.
|
||||
|
||||
Defaults to "paperless".
|
||||
|
||||
@@ -60,6 +72,13 @@ PAPERLESS_DBSSLMODE=<mode>
|
||||
|
||||
Default is ``prefer``.
|
||||
|
||||
PAPERLESS_DB_TIMEOUT=<float>
|
||||
Amount of time for a database connection to wait for the database to unlock.
|
||||
Mostly applicable for an SQLite-based installation; consider changing to PostgreSQL
|
||||
if you need to increase this.
|
||||
|
||||
Defaults to unset, keeping the Django defaults.
|
||||
|
||||
Paths and folders
|
||||
#################
|
||||
|
||||
@@ -80,6 +99,15 @@ PAPERLESS_DATA_DIR=<path>
|
||||
|
||||
Defaults to "../data/", relative to the "src" directory.
|
||||
|
||||
PAPERLESS_TRASH_DIR=<path>
|
||||
Instead of removing deleted documents, they are moved to this directory.
|
||||
|
||||
This must be writable by the user running paperless. When running inside
|
||||
docker, ensure that this path is within a permanent volume (such as
|
||||
"../media/trash") so it won't get lost on upgrades.
|
||||
|
||||
Defaults to empty (i.e. really delete documents).
|
||||
|
||||
PAPERLESS_MEDIA_ROOT=<path>
|
||||
This is where your documents and thumbnails are stored.
|
||||
|
||||
@@ -102,6 +130,14 @@ PAPERLESS_FILENAME_FORMAT=<format>
|
||||
|
||||
Default is none, which disables this feature.
|
||||
|
||||
PAPERLESS_FILENAME_FORMAT_REMOVE_NONE=<bool>
|
||||
Tells paperless to omit placeholders in `PAPERLESS_FILENAME_FORMAT` that would resolve
|
||||
to 'none' from the resulting filename. This also holds true for directory
|
||||
names.
|
||||
See :ref:`advanced-file_name_handling` for details.
|
||||
|
||||
Defaults to `false` which disables this feature.
|
||||
|
||||
PAPERLESS_LOGGING_DIR=<path>
|
||||
This is where paperless will store log files.
|
||||
|
||||
@@ -121,6 +157,8 @@ PAPERLESS_LOGROTATE_MAX_BACKUPS=<num>
|
||||
|
||||
Defaults to 20.
|
||||
|
||||
.. _hosting-and-security:
|
||||
|
||||
Hosting & Security
|
||||
##################
|
||||
|
||||
@@ -133,7 +171,24 @@ PAPERLESS_SECRET_KEY=<key>
|
||||
|
||||
Default is listed in the file ``src/paperless/settings.py``.
|
||||
|
||||
PAPERLESS_ALLOWED_HOSTS<comma-separated-list>
|
||||
PAPERLESS_URL=<url>
|
||||
This setting can be used to set the three options below (ALLOWED_HOSTS,
|
||||
CORS_ALLOWED_HOSTS and CSRF_TRUSTED_ORIGINS). If the other options are
|
||||
set the values will be combined with this one. Do not include a trailing
|
||||
slash. E.g. https://paperless.domain.com
|
||||
|
||||
Defaults to empty string, leaving the other settings unaffected.
|
||||
|
||||
PAPERLESS_CSRF_TRUSTED_ORIGINS=<comma-separated-list>
|
||||
A list of trusted origins for unsafe requests (e.g. POST). As of Django 4.0
|
||||
this is required to access the Django admin via the web.
|
||||
See https://docs.djangoproject.com/en/4.0/ref/settings/#csrf-trusted-origins
|
||||
|
||||
Can also be set using PAPERLESS_URL (see above).
|
||||
|
||||
Defaults to empty string, which does not add any origins to the trusted list.
|
||||
|
||||
PAPERLESS_ALLOWED_HOSTS=<comma-separated-list>
|
||||
If you're planning on putting Paperless on the open internet, then you
|
||||
really should set this value to the domain name you're using. Failing to do
|
||||
so leaves you open to HTTP host header attacks:
|
||||
@@ -142,12 +197,19 @@ PAPERLESS_ALLOWED_HOSTS<comma-separated-list>
|
||||
Just remember that this is a comma-separated list, so "example.com" is fine,
|
||||
as is "example.com,www.example.com", but NOT " example.com" or "example.com,"
|
||||
|
||||
Can also be set using PAPERLESS_URL (see above).
|
||||
|
||||
If manually set, please remember to include "localhost". Otherwise docker
|
||||
healthcheck will fail.
|
||||
|
||||
Defaults to "*", which is all hosts.
|
||||
|
||||
PAPERLESS_CORS_ALLOWED_HOSTS<comma-separated-list>
|
||||
PAPERLESS_CORS_ALLOWED_HOSTS=<comma-separated-list>
|
||||
You need to add your servers to the list of allowed hosts that can do CORS
|
||||
calls. Set this to your public domain name.
|
||||
|
||||
Can also be set using PAPERLESS_URL (see above).
|
||||
|
||||
Defaults to "http://localhost:8000".
|
||||
|
||||
PAPERLESS_FORCE_SCRIPT_NAME=<path>
|
||||
@@ -159,9 +221,16 @@ PAPERLESS_FORCE_SCRIPT_NAME=<path>
|
||||
PAPERLESS_STATIC_URL=<path>
|
||||
Override the STATIC_URL here. Unless you're hosting Paperless off a
|
||||
subdomain like /paperless/, you probably don't need to change this.
|
||||
If you do change it, be sure to include the trailing slash.
|
||||
|
||||
Defaults to "/static/".
|
||||
|
||||
.. note::
|
||||
|
||||
When hosting paperless behind a reverse proxy like Traefik or Nginx at a subpath e.g.
|
||||
example.com/paperlessngx you will also need to set ``PAPERLESS_FORCE_SCRIPT_NAME``
|
||||
(see above).
|
||||
|
||||
PAPERLESS_AUTO_LOGIN_USERNAME=<username>
|
||||
Specify a username here so that paperless will automatically perform login
|
||||
with the selected user.
|
||||
@@ -176,7 +245,7 @@ PAPERLESS_AUTO_LOGIN_USERNAME=<username>
|
||||
PAPERLESS_ADMIN_USER=<username>
|
||||
If this environment variable is specified, Paperless automatically creates
|
||||
a superuser with the provided username at start. This is useful in cases
|
||||
where you can not run the `createsuperuser` command seperately, such as Kubernetes
|
||||
where you can not run the `createsuperuser` command separately, such as Kubernetes
|
||||
or AWS ECS.
|
||||
|
||||
Requires `PAPERLESS_ADMIN_PASSWORD` to be set.
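For instance (illustrative values only; pick a strong password):

.. code::

    PAPERLESS_ADMIN_USER=admin
    PAPERLESS_ADMIN_PASSWORD=<a-strong-password>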
|
||||
@@ -233,6 +302,13 @@ PAPERLESS_HTTP_REMOTE_USER_HEADER_NAME=<str>
|
||||
|
||||
Defaults to `HTTP_REMOTE_USER`.
|
||||
|
||||
PAPERLESS_LOGOUT_REDIRECT_URL=<str>
|
||||
URL to redirect the user to after a logout. This can be used together with
|
||||
`PAPERLESS_ENABLE_HTTP_REMOTE_USER` to redirect the user back to the SSO
|
||||
application's logout page.
|
||||
|
||||
Defaults to None, which disables this feature.
|
||||
|
||||
.. _configuration-ocr:
|
||||
|
||||
OCR settings
|
||||
@@ -373,6 +449,24 @@ PAPERLESS_OCR_IMAGE_DPI=<num>
|
||||
Default is none, which will automatically calculate image DPI so that
|
||||
the produced PDF documents are A4 sized.
|
||||
|
||||
PAPERLESS_OCR_MAX_IMAGE_PIXELS=<num>
|
||||
Paperless will raise a warning when OCRing images which are over this limit and
|
||||
will not OCR images which are more than twice this limit. Note this does not
|
||||
prevent the document from being consumed, but could result in missing text content.
|
||||
|
||||
If unset, will default to the value determined by
|
||||
`Pillow <https://pillow.readthedocs.io/en/stable/reference/Image.html#PIL.Image.MAX_IMAGE_PIXELS>`_.
|
||||
|
||||
.. note::
|
||||
|
||||
Increasing this limit could cause Paperless to consume additional resources
|
||||
when consuming a file. Be sure you have sufficient system resources.
|
||||
|
||||
.. caution::
|
||||
|
||||
The limit is intended to prevent malicious files from consuming system resources
|
||||
and causing crashes and other errors. Only increase this value if you are certain
|
||||
your documents are not malicious and you need the text which was not OCRed.
|
||||
|
||||
PAPERLESS_OCR_USER_ARGS=<json>
|
||||
OCRmyPDF offers many more options. Use this parameter to specify any
|
||||
@@ -402,7 +496,7 @@ Tika settings
|
||||
#############
|
||||
|
||||
Paperless can make use of `Tika <https://tika.apache.org/>`_ and
|
||||
`Gotenberg <https://thecodingmachine.github.io/gotenberg/>`_ for parsing and
|
||||
`Gotenberg <https://gotenberg.dev/>`_ for parsing and
|
||||
converting "Office" documents (such as ".doc", ".xlsx" and ".odt"). If you
|
||||
wish to use this, you must provide a Tika server and a Gotenberg server,
|
||||
configure their endpoints, and enable the feature.
|
||||
@@ -423,7 +517,7 @@ PAPERLESS_TIKA_GOTENBERG_ENDPOINT=<url>
|
||||
Defaults to "http://localhost:3000".
|
||||
|
||||
If you run paperless on docker, you can add those services to the docker-compose
|
||||
file (see the provided ``docker-compose.tika.yml`` file for reference). The changes
|
||||
file (see the provided ``docker-compose.sqlite-tika.yml`` file for reference). The changes
|
||||
required are as follows:
|
||||
|
||||
.. code:: yaml
|
||||
@@ -444,19 +538,22 @@ requires are as follows:
|
||||
# ...
|
||||
|
||||
gotenberg:
|
||||
image: thecodingmachine/gotenberg
|
||||
image: gotenberg/gotenberg:7.4
|
||||
restart: unless-stopped
|
||||
environment:
|
||||
DISABLE_GOOGLE_CHROME: 1
|
||||
command:
|
||||
- "gotenberg"
|
||||
- "--chromium-disable-routes=true"
|
||||
|
||||
tika:
|
||||
image: apache/tika
|
||||
image: ghcr.io/paperless-ngx/tika:latest
|
||||
restart: unless-stopped
|
||||
|
||||
Add the configuration variables to the environment of the webserver (alternatively
|
||||
put the configuration in the ``docker-compose.env`` file) and add the additional
|
||||
services below the webserver service. Watch out for indentation.
|
||||
|
||||
Make sure to use the correct format `PAPERLESS_TIKA_ENABLED = 1` so python_dotenv can parse the statement correctly.
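For reference, the corresponding variables might look like this sketch; the host names assume the docker-compose service names shown above, and the Tika port assumes Tika's usual default:

.. code::

    PAPERLESS_TIKA_ENABLED=1
    PAPERLESS_TIKA_ENDPOINT=http://tika:9998
    PAPERLESS_TIKA_GOTENBERG_ENDPOINT=http://gotenberg:3000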
|
||||
|
||||
Software tweaks
|
||||
###############
|
||||
|
||||
@@ -465,6 +562,8 @@ PAPERLESS_TASK_WORKERS=<num>
|
||||
maintain the automatic matching algorithm, check emails, consume documents,
|
||||
etc. This variable specifies how many things it will do in parallel.
|
||||
|
||||
Defaults to 1.
|
||||
|
||||
|
||||
PAPERLESS_THREADS_PER_WORKER=<num>
|
||||
Furthermore, paperless uses multiple threads when consuming documents to
|
||||
@@ -507,6 +606,16 @@ PAPERLESS_THREADS_PER_WORKER=<num>
|
||||
PAPERLESS_THREADS_PER_WORKER automatically.
|
||||
|
||||
|
||||
PAPERLESS_WORKER_TIMEOUT=<num>
|
||||
Machines with few or weak cores might not be able to finish OCR on
|
||||
large documents within the default 1800 seconds. So extending this timeout
|
||||
may prove to be useful on weak hardware setups.
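For example, doubling the allowance (an illustrative value):

.. code::

    PAPERLESS_WORKER_TIMEOUT=3600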
|
||||
|
||||
PAPERLESS_WORKER_RETRY=<num>
|
||||
If PAPERLESS_WORKER_TIMEOUT has been configured, the retry time for a task can
|
||||
also be configured. By default, this value will be set to 10s more than the
|
||||
worker timeout. This value should never be set less than the worker timeout.

PAPERLESS_TIME_ZONE=<timezone>
    Set the time zone here.
    See https://docs.djangoproject.com/en/3.1/ref/settings/#std:setting-TIME_ZONE
@@ -526,6 +635,28 @@ PAPERLESS_CONSUMER_POLLING=<num>

    Defaults to 0, which disables polling and uses filesystem notifications.

PAPERLESS_CONSUMER_POLLING_RETRY_COUNT=<num>
    If consumer polling is enabled, sets the number of times paperless will check for a
    file to remain unmodified.

    Defaults to 5.

PAPERLESS_CONSUMER_POLLING_DELAY=<num>
    If consumer polling is enabled, sets the delay in seconds between each of the
    checks described above while paperless waits for a file to remain unmodified.

    Defaults to 5.
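A rough sketch of how the retry count and delay interact while polling; the function below is illustrative and not the consumer's actual code:

.. code:: python

    import os
    import time

    def wait_until_stable(path: str) -> bool:
        """Poll a file's size until it stops changing between two checks."""
        retries = int(os.getenv("PAPERLESS_CONSUMER_POLLING_RETRY_COUNT", "5"))
        delay = float(os.getenv("PAPERLESS_CONSUMER_POLLING_DELAY", "5"))
        last_size = -1
        for _ in range(retries):
            size = os.path.getsize(path)
            if size == last_size:
                return True   # unchanged since the previous check
            last_size = size
            time.sleep(delay)
        return False          # still changing after all retries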

.. _configuration-inotify:

PAPERLESS_CONSUMER_INOTIFY_DELAY=<num>
    Sets the time in seconds the consumer will wait for additional events
    from inotify before it considers a file ready and begins consumption.
    Certain scanners or network setups may generate multiple events for a single file,
    leading to multiple consumers working on the same file. Configure this to
    prevent that.

    Defaults to 0.5 seconds.
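Conceptually this is a debounce: each new inotify event for a file resets a timer, and consumption starts only once the file has been quiet for the configured delay. A hedged sketch (the event source and its attributes are hypothetical):

.. code:: python

    import time

    def ready_files(get_next_event, delay: float = 0.5):
        """Yield paths once no further inotify event arrived for `delay` seconds."""
        last_seen = {}
        while True:
            event = get_next_event(timeout=delay)   # hypothetical event source
            now = time.monotonic()
            if event is not None:
                last_seen[event.path] = now         # a new event resets the timer
                continue
            for path, seen in list(last_seen.items()):
                if now - seen >= delay:
                    del last_seen[path]
                    yield path                      # file considered ready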

PAPERLESS_CONSUMER_DELETE_DUPLICATES=<bool>
    When the consumer detects a duplicate document, it will not touch the
@@ -554,6 +685,37 @@ PAPERLESS_CONSUMER_SUBDIRS_AS_TAGS=<bool>

    Defaults to false.

PAPERLESS_CONSUMER_ENABLE_BARCODES=<bool>
    Enables scanning and page separation based on detected barcodes.
    This allows for scanning and adding multiple documents per uploaded
    file, which are separated by one or multiple barcode pages.

    For ease of use, it is suggested to use a standardized separation page,
    e.g. `here <https://www.alliancegroup.co.uk/patch-codes.htm>`_.

    If no barcodes are detected in the uploaded file, no page separation
    will happen.

    The original document will be removed and the separated pages will be
    saved as pdf.

    Defaults to false.

PAPERLESS_CONSUMER_BARCODE_TIFF_SUPPORT=<bool>
    Whether TIFF image files should be scanned for barcodes.
    This will automatically convert any TIFF image(s) to pdfs for later
    processing.
    This only has an effect if PAPERLESS_CONSUMER_ENABLE_BARCODES has been
    enabled.

    Defaults to false.

PAPERLESS_CONSUMER_BARCODE_STRING=PATCHT
    Defines the string to be detected as a separator barcode.
    If paperless is used with the PATCH-T separator pages, users
    shouldn't change this.

    Defaults to "PATCHT"
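To make the behaviour concrete, here is a conceptual sketch of barcode-based separation using pdf2image, pyzbar and pikepdf; these libraries and the helper functions are assumptions made for the example, not a statement about what paperless itself uses.

.. code:: python

    import pikepdf
    from pdf2image import convert_from_path
    from pyzbar.pyzbar import decode

    SEPARATOR = "PATCHT"  # PAPERLESS_CONSUMER_BARCODE_STRING

    def separator_pages(pdf_path: str) -> set:
        """Return the 0-based indices of pages carrying the separator barcode."""
        images = convert_from_path(pdf_path)
        return {
            index
            for index, image in enumerate(images)
            if any(code.data.decode() == SEPARATOR for code in decode(image))
        }

    def split_at(pdf_path: str, separators: set) -> None:
        """Write one PDF per group of pages between separator pages."""
        with pikepdf.open(pdf_path) as source:
            part, current = 0, pikepdf.new()
            for index, page in enumerate(source.pages):
                if index in separators:       # the barcode page itself is dropped
                    if len(current.pages):
                        current.save(f"{pdf_path}.part{part}.pdf")
                        part += 1
                        current = pikepdf.new()
                else:
                    current.pages.append(page)
            if len(current.pages):
                current.save(f"{pdf_path}.part{part}.pdf")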

PAPERLESS_CONVERT_MEMORY_LIMIT=<num>
    On smaller systems, or even in the case of Very Large Documents, the consumer
@@ -578,13 +740,6 @@ PAPERLESS_CONVERT_TMPDIR=<path>

    Default is none, which disables the temporary directory.

PAPERLESS_OPTIMIZE_THUMBNAILS=<bool>
    Use optipng to optimize thumbnails. This usually reduces the size of
    thumbnails by about 20%, but uses considerable compute time during
    consumption.

    Defaults to true.

PAPERLESS_POST_CONSUME_SCRIPT=<filename>
    After a document is consumed, Paperless can trigger an arbitrary script if
    you like. This script will be passed a number of arguments for you to work
@@ -600,8 +755,24 @@ PAPERLESS_FILENAME_DATE_ORDER=<format>
    The filename will be checked first, and if nothing is found, the document
    text will be checked as normal.

    A date in a filename must have some separators (`.`, `-`, `/`, etc)
    for it to be parsed.

    Defaults to none, which disables this feature.
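For illustration, a filename date could be picked up along these lines using dateparser (this sketch, including the regular expression, is an assumption rather than the actual consumer code):

.. code:: python

    import re

    import dateparser

    def date_from_filename(filename: str, date_order: str = "YMD"):
        """Look for a separator-delimited date such as 2020-12-02 in the filename."""
        match = re.search(r"\d{1,4}([.\-/]\d{1,4}){2}", filename)
        if not match:
            return None
        return dateparser.parse(match.group(0), settings={"DATE_ORDER": date_order})

    print(date_from_filename("scan_2020-12-02_invoice.pdf"))  # 2020-12-02 00:00:00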

PAPERLESS_NUMBER_OF_SUGGESTED_DATES=<num>
    Paperless searches an entire document for dates. The first date found will
    be used as the initial value for the created date. When this variable is
    greater than 0 (or left at its default value), paperless will also suggest
    other dates found in the document, up to a maximum of this setting. Note that
    duplicates will be removed, which can result in fewer dates displayed in the
    frontend than this setting value.

    The task to find all dates can be time-consuming and increases with a higher
    (maximum) number of suggested dates and slower hardware.

    Defaults to 3. Set to 0 to disable this feature.
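The described dedup-and-cap behaviour boils down to something like the following sketch (illustrative only):

.. code:: python

    from datetime import date

    def suggest_dates(found, limit: int = 3):
        """Keep the first occurrence of each date, in document order, up to `limit`."""
        if limit <= 0:
            return []            # feature disabled
        unique = []
        for candidate in found:
            if candidate not in unique:
                unique.append(candidate)
            if len(unique) == limit:
                break
        return unique

    # Duplicates collapse, so fewer than `limit` suggestions may remain:
    print(suggest_dates([date(2020, 12, 2), date(2020, 12, 2), date(1999, 4, 22)]))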

PAPERLESS_THUMBNAIL_FONT_NAME=<filename>
    Paperless creates thumbnails for plain text files by rendering the content
    of the file on an image and uses a predefined font for that. This
@@ -617,10 +788,7 @@ PAPERLESS_IGNORE_DATES=<string>
    this process. This is useful for special dates (like date of birth) that appear
    in documents regularly but are very unlikely to be the document's creation date.

    You may specify dates in a multitude of formats supported by dateparser (see
    https://dateparser.readthedocs.io/en/latest/#popular-formats) but as the dates
    need to be comma separated, the options are limited.
    Example: "2020-12-02,22.04.1999"
    The date is parsed using the order specified in PAPERLESS_DATE_ORDER

    Defaults to an empty string to not ignore any dates.
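Parsing that comma-separated value with dateparser could look roughly like this; the helper name and the hard-coded date order are assumptions for the example:

.. code:: python

    import os

    import dateparser

    def ignored_dates(date_order: str = "DMY") -> set:
        raw = os.getenv("PAPERLESS_IGNORE_DATES", "")
        result = set()
        for token in (part.strip() for part in raw.split(",")):
            if not token:
                continue
            parsed = dateparser.parse(token, settings={"DATE_ORDER": date_order})
            if parsed is not None:
                result.add(parsed.date())
        return result

    # With PAPERLESS_IGNORE_DATES="2020-12-02,22.04.1999" this yields two dates.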

@@ -637,7 +805,7 @@ PAPERLESS_CONSUMER_IGNORE_PATTERNS=<json>

    This can be adjusted by configuring a custom json array with patterns to exclude.

    Defautls to ``[".DS_STORE/*", "._*", ".stfolder/*"]``.
    Defaults to ``[".DS_STORE/*", "._*", ".stfolder/*", ".stversions/*", ".localized/*", "desktop.ini"]``.
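As a sketch of how such glob patterns can be applied to incoming paths (the matching rules here are simplified assumptions, not the consumer's exact logic):

.. code:: python

    import json
    from fnmatch import fnmatch
    from pathlib import PurePath

    PATTERNS = json.loads('[".DS_STORE/*", "._*", ".stfolder/*"]')

    def is_ignored(relative_path: str) -> bool:
        """Ignore a file if its relative path or its file name matches a pattern."""
        name = PurePath(relative_path).name
        return any(
            fnmatch(relative_path, pattern) or fnmatch(name, pattern)
            for pattern in PATTERNS
        )

    print(is_ignored(".stfolder/tmp0001.pdf"))   # True
    print(is_ignored("invoices/scan.pdf"))       # False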

Binaries
########
@@ -650,13 +818,10 @@ the program doesn't automatically execute it (ie. the program isn't in your
$PATH), then you'll need to specify the literal path for that program.

PAPERLESS_CONVERT_BINARY=<path>
    Defaults to "/usr/bin/convert".
    Defaults to "convert".

PAPERLESS_GS_BINARY=<path>
    Defaults to "/usr/bin/gs".

PAPERLESS_OPTIPNG_BINARY=<path>
    Defaults to "/usr/bin/optipng".
    Defaults to "gs".


.. _configuration-docker:
@@ -673,9 +838,25 @@ PAPERLESS_WEBSERVER_WORKERS=<num>
    also loads the entire application into memory separately, so increasing this value
    will increase RAM usage.

    Consider configuring this to 1 on low power devices with a limited amount of RAM.
    Defaults to 1.

    Defaults to 2.
PAPERLESS_BIND_ADDR=<ip address>
    The IP address the webserver will listen on inside the container. There are
    special setups where you may need to configure this value to restrict the
    IP address or interface the webserver listens on.

    Defaults to [::], meaning all interfaces, including IPv6.

PAPERLESS_PORT=<port>
    The port number the webserver will listen on inside the container. There are
    special setups where you may need this to avoid collisions with other
    services (like using podman with multiple containers in one pod).

    Don't change this when using Docker. To change the port the webserver is
    reachable outside of the container, instead refer to the "ports" key in
    ``docker-compose.yml``.

    Defaults to 8000.

USERMAP_UID=<uid>
    The ID of the paperless user in the container. Set this to your actual user ID on the
@@ -719,3 +900,26 @@ PAPERLESS_OCR_LANGUAGES=<list>
        PAPERLESS_OCR_LANGUAGE=tur

    Defaults to none, which does not install any additional languages.


.. _configuration-update-checking:

Update Checking
###############

PAPERLESS_ENABLE_UPDATE_CHECK=<bool>
    Enable (or disable) the automatic check for available updates. This feature is disabled
    by default, but if it is not explicitly set, Paperless-ngx will show a message about this.

    If enabled, the feature works by pinging the GitHub API for the latest release, e.g.
    https://api.github.com/repos/paperless-ngx/paperless-ngx/releases/latest
    to determine whether a new version is available.

    Actual updating of the app must still be performed manually.

    Note that for users of third-party containers, e.g. linuxserver.io, this notification
    may be 'ahead' of a new release from the third-party maintainers.

    In either case, no tracking data is collected by the app in any way.

    Defaults to none, which disables the feature.
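A rough sketch of such a check against the release endpoint above; the version comparison and error handling are simplified assumptions, not the app's actual implementation:

.. code:: python

    import requests

    RELEASES_URL = (
        "https://api.github.com/repos/paperless-ngx/paperless-ngx/releases/latest"
    )

    def as_tuple(version: str):
        return tuple(int(part) for part in version.lstrip("v").split("."))

    def newer_release_available(current_version: str) -> bool:
        response = requests.get(RELEASES_URL, timeout=10)
        response.raise_for_status()
        latest = response.json()["tag_name"]
        return as_tuple(latest) > as_tuple(current_version)

    # Updating itself remains a manual step; this only reports availability.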

@@ -1,145 +0,0 @@
.. _contributing:

Contributing to Paperless
#########################

.. warning::

    This section is not updated to paperless-ng yet.

Maybe you've been using Paperless for a while and want to add a feature or two,
or maybe you've come across a bug that you have some ideas how to solve. The
beauty of Free software is that you can see what's wrong and help to get it
fixed for everyone!


How to Get Your Changes Rolled Into Paperless
=============================================

If you've found a bug, but don't know how to fix it, you can always post an
issue on `GitHub`_ in the hopes that someone will have the time to fix it for
you. If however you're the one with the time, pull requests are always
welcome, you just have to make sure that your code conforms to a few standards:

Pep8
----

It's the standard for all Python development, so it's `very well documented`_.
The short version is:

* Lines should wrap at 79 characters
* Use ``snake_case`` for variables, ``CamelCase`` for classes, and ``ALL_CAPS``
  for constants.
* Space out your operators: ``stuff + 7`` instead of ``stuff+7``
* Two empty lines between classes, and functions, but 1 empty line between
  class methods.

There's more to it than that, but if you follow those, you'll probably be
alright. When you submit your pull request, there's a pep8 checker that'll
look at your code to see if anything is off. If it finds anything, it'll
complain at you until you fix it.


Additional Style Guides
-----------------------

Where pep8 is ambiguous, I've tried to be a little more specific. These rules
aren't hard-and-fast, but if you can conform to them, I'll appreciate it and
spend less time trying to conform your PR before merging:


Function calls
..............

If you're calling a function and that necessitates more than one line of code,
please format it like this:

.. code:: python

    my_function(
        argument1,
        kwarg1="x",
        kwarg2="y",
        another_really_long_kwarg="some big value",
        a_kwarg_calling_another_long_function=another_function(
            another_arg,
            another_kwarg="kwarg!"
        )
    )

This is all in the interest of code uniformity rather than anything else. If
we stick to a style, everything is understandable in the same way.


Quoting Strings
...............

pep8 is a little too open-minded on this for my liking. Python strings should
be quoted with double quotes (``"``) except in cases where the resulting string
would require too much escaping of a double quote, in which case, a single
quoted, or triple-quoted string will do:

.. code:: python

    my_string = "This is my string"
    problematic_string = 'This is a "string" with "quotes" in it'

In HTML templates, please use double-quotes for tag attributes, and single
quotes for arguments passed to Django template tags:

.. code:: html

    <div class="stuff">
      <a href="{% url 'some-url-name' pk='w00t' %}">link this</a>
    </div>

This is to keep linters happy; otherwise they'd look at an HTML file and see an
attribute closing the ``"`` before it should have been.

--

That's all there is in terms of guidelines, so I hope it's not too daunting.


Indentation & Spacing
.....................

When it comes to indentation:

* For Python, the rule is: follow pep8 and use 4 spaces.
* For Javascript, CSS, and HTML, please use 1 tab.

Additionally, Django templates making use of block elements like ``{% if %}``,
``{% for %}``, and ``{% block %}`` etc. should be indented:

Good:

.. code:: html

    {% block stuff %}
      <h1>This is the stuff</h1>
    {% endblock %}

Bad:

.. code:: html

    {% block stuff %}
    <h1>This is the stuff</h1>
    {% endblock %}


The Code of Conduct
===================

Paperless has a `code of conduct`_. It's a lot like the other ones you see out
there, with a few small changes, but basically it boils down to:

> Don't be an ass, or you might get banned.

I'm proud to say that the CoC has never had to be enforced because everyone has
been awesome, friendly, and professional.

.. _GitHub: https://github.com/the-paperless-project/paperless/issues
.. _very well documented: https://www.python.org/dev/peps/pep-0008/
.. _code of conduct: https://github.com/the-paperless-project/paperless/blob/master/CODE_OF_CONDUCT.md