Compare commits


901 Commits

SHA1 Message Date
5cec8add5e chore: Release 2025-04-20 09:46:49 -07:00
0225dbde3a procmail2notmuch: don't run migration code, leave it to server 2025-04-20 09:46:27 -07:00
f84b8fa6c2 chore: Release 2025-04-20 09:38:35 -07:00
979cbcd23e procmail2notmuch: include early exit option 2025-04-20 09:37:51 -07:00
b3070e1919 web: use random emoji when search results empty, handle search vs catchup 2025-04-20 09:37:12 -07:00
e5fdde8f30 web: add graphic when search results are empty 2025-04-20 09:07:43 -07:00
7de36bbc3d procmail2notmuch: add sql rule loader 2025-04-20 08:40:06 -07:00
1c4f27902e server: add todo 2025-04-20 08:39:47 -07:00
7ee86f0d2f chore: Release 2025-04-19 13:19:14 -07:00
a0b06fd5ef chore: Release 2025-04-19 13:17:01 -07:00
630bb20b35 procmail2notmuch: add debug vs notmuchrc modes 2025-04-19 13:16:47 -07:00
17ea2a35cb web: tweak style and behavior of view original link 2025-04-19 13:11:57 -07:00
7d9376d607 Add view original functionality 2025-04-19 12:33:11 -07:00
122e949072 chore: Release 2025-04-16 08:48:35 -07:00
9a69b4c51e web: scroll to top on pagination 2025-04-16 08:47:45 -07:00
251151244b chore: Release 2025-04-15 20:38:08 -07:00
9d232b666b server: add debug message for WS connection 2025-04-15 20:37:35 -07:00
1832d77e78 chore: Release 2025-04-15 20:30:21 -07:00
aca6bce1ff web: connect to the correct ws endpoint in production 2025-04-15 20:30:02 -07:00
7bb2f405da chore: Release 2025-04-15 19:33:55 -07:00
60e2824167 server: reenable per-account unread counts 2025-04-15 19:33:32 -07:00
cffc228b3a chore: Release 2025-04-15 19:25:41 -07:00
318c366d82 server: disable per-email counts in tags, it's breaking production 2025-04-15 19:25:22 -07:00
90d7f79ca0 server: slow refresh interval as procmail should be on demand 2025-04-15 19:24:59 -07:00
3f87038776 web: proxy /notifcation 2025-04-15 18:39:36 -07:00
92b880f03b chore: Release 2025-04-15 17:46:18 -07:00
94f1e84857 server: add notification handlers for refreshing mail and news 2025-04-15 17:45:47 -07:00
221b4f10df chore: Release 2025-04-15 16:36:40 -07:00
225615f4ea server: move config to cmdline args 2025-04-15 16:36:19 -07:00
b8ef753f85 chore: Release 2025-04-15 16:09:29 -07:00
33edd22f8f web: add mock wasm-socket for building on non-wasm 2025-04-15 16:09:19 -07:00
75e9232095 chore: Release 2025-04-15 16:09:19 -07:00
6daddf11de Remove unused dependencies 2025-04-15 16:09:19 -07:00
36d9eda303 chore: Release 2025-04-15 16:09:19 -07:00
4eb2d4c689 cargo sqlx prepare 2025-04-15 16:09:19 -07:00
edc7119fbf server: finish port to axum w/ websockets 2025-04-15 16:09:19 -07:00
aa1736a285 web: highlight button for current search, bring back debug unread 2025-04-15 16:09:19 -07:00
6f93aa4f34 server: poll for new messages and update clients via WS 2025-04-15 16:09:19 -07:00
0662e6230e server: instrument catchup 2025-04-15 16:09:19 -07:00
30f3f14040 web: plumb websocket messages through to UI 2025-04-15 16:09:19 -07:00
f2042f284e Add websocket handler on server, connect from client
Additionally add /test handler that triggers server->client WS message
2025-04-15 16:09:19 -07:00
b2c73ffa15 Try using axum instead of rocket. WS doesn't seem to work through trunk 2025-04-15 16:09:19 -07:00
d7217d1b3c WIP subscription support, will require switching webserver 2025-04-15 16:09:19 -07:00
638d55a36c web: prototype websocket client 2025-04-15 16:09:19 -07:00
b11f6b5149 fix(deps): update rust crate sqlx to v0.8.5 2025-04-15 22:31:38 +00:00
d0b5ecf4f2 chore: Release 2025-04-14 08:40:18 -07:00
7a67c30a2c web: make search input larger and disable focus outline 2025-04-14 08:40:10 -07:00
5ea4694eb8 fix(deps): update rust crate sqlx to v0.8.4 2025-04-14 05:16:45 +00:00
e01dabe6ed chore: Release 2025-04-13 22:01:29 -07:00
ecaf0dd0fc web: remove unused import 2025-04-13 22:01:17 -07:00
3d4dcc9e6b chore: Release 2025-04-13 20:53:47 -07:00
28a5d9f219 web: add buttons for just unread news and unread mail 2025-04-13 20:53:19 -07:00
81876d37ea web: fix click handling in news post header 2025-04-13 20:53:19 -07:00
4a6b159ddb web: always show bulk-edit checkbox, fix check logic 2025-04-13 20:53:19 -07:00
d84957cc8c web: use current thread, not first !seen in catchup mode 2025-04-13 20:53:19 -07:00
d53db5b49a chore(deps): lock file maintenance 2025-04-14 00:46:58 +00:00
0448368011 chore(deps): lock file maintenance 2025-04-14 00:02:00 +00:00
36754136fd chore: Release 2025-04-13 08:31:45 -07:00
489acccf77 web: force background color for code snippets 2025-04-13 08:31:20 -07:00
8ef4db63ad fix(deps): update rust crate clap to v4.5.36 2025-04-11 20:46:39 +00:00
9f63205ff3 chore: Release 2025-04-10 12:35:10 -07:00
5a0378948d web: apply title wrapping on search results page 2025-04-10 12:32:46 -07:00
2b4c45be74 web: conditionally wrap title when large words found 2025-04-10 12:16:53 -07:00
147896dc80 chore: Release 2025-04-09 20:35:49 -07:00
1ff6ec7653 web: wrap long titles on message view 2025-04-09 20:35:33 -07:00
acd590111e chore: Release 2025-04-09 19:17:52 -07:00
b5f24ba1f2 server: strip element sizing attributes and inline style 2025-04-09 19:17:19 -07:00
79ed24135f fix(deps): update rust crate tantivy to 0.24.0 2025-04-09 18:01:42 +00:00
a4949a25b5 fix(deps): update rust crate cacher to 0.2.0 2025-04-07 03:46:21 +00:00
f16edef124 chore(deps): lock file maintenance 2025-04-07 00:01:51 +00:00
2fd6479cb9 fix(deps): update rust crate tokio to v1.44.2 2025-04-05 15:47:48 +00:00
85a6b3a9a4 chore: Release 2025-04-02 16:53:57 -07:00
9ac5216d6e web: more pre/code css tweaks 2025-04-02 16:53:37 -07:00
82987dbd20 web: tweak style of code blocks 2025-04-02 16:46:24 -07:00
29de7c0727 chore: Release 2025-04-02 13:27:18 -07:00
5f6580fa2f web: remove unreachable code 2025-04-02 13:27:02 -07:00
5d4732d75d chore: Release 2025-04-02 12:22:29 -07:00
a13bac813a web: make money stuff mobile friendly 2025-04-02 12:21:54 -07:00
85dcc9f7bd fix(deps): update rust crate clap to v4.5.35 2025-04-01 17:31:11 +00:00
b696629ad9 chore(deps): lock file maintenance 2025-03-30 23:46:58 +00:00
b9e3128718 fix(deps): update all non-major dependencies 2025-03-30 23:17:15 +00:00
88fac4c2bc chore: Release 2025-03-30 16:10:01 -07:00
1fad5ec536 server: remove unused dep opentelemetry 2025-03-30 16:09:42 -07:00
8e7214d531 chore: Release 2025-03-30 11:18:44 -07:00
333c4a3ebb server: rewrite old nzbfinder download links 2025-03-30 11:18:19 -07:00
b9ba5a3bea fix(deps): update all non-major dependencies 2025-03-20 05:31:31 +00:00
2a0989e74d chore(deps): lock file maintenance 2025-03-17 00:01:34 +00:00
e9319dc491 fix(deps): update rust crate async-trait to v0.1.88 2025-03-15 01:16:46 +00:00
57481a77cd fix(deps): update rust crate uuid to v1.16.0 2025-03-14 04:31:07 +00:00
44915cce54 fix(deps): update rust crate tokio to v1.44.1 2025-03-13 08:31:33 +00:00
1225483b57 chore: Release 2025-03-12 16:44:04 -07:00
daeb8c88a1 server: recover on slurp fetch failures 2025-03-12 16:43:48 -07:00
8a6b3ff501 chore: Release 2025-03-12 13:53:27 -07:00
a6fffeafdc web: change autoreload logic 2025-03-12 13:53:11 -07:00
d791b4ce49 chore: Release 2025-03-12 13:50:45 -07:00
8a0e4eb441 web: log all state changes and don't autoreload on error, causes infini-loop 2025-03-12 13:50:39 -07:00
fc84562419 fix(deps): update rust crate reqwest to v0.12.14 2025-03-12 13:46:26 +00:00
37ebe1ebb3 fix(deps): update rust crate reqwest to v0.12.13 2025-03-11 20:47:18 +00:00
2d06f070ea chore: Release 2025-03-10 19:38:57 -07:00
527a62069a Revert "web: center contents in catchup mode"
This reverts commit 1411961e36.
2025-03-10 19:38:32 -07:00
40afafe1a8 fix(deps): update rust crate clap to v4.5.32 2025-03-10 21:01:24 +00:00
e3acf9ae6d chore(deps): lock file maintenance 2025-03-10 00:05:51 +00:00
a68d067a68 fix(deps): update rust crate serde to v1.0.219 2025-03-09 20:01:48 +00:00
5547c65af0 fix(deps): update rust crate tokio to v1.44.0 2025-03-09 16:24:42 +00:00
b622bb7d7d chore: Release 2025-03-08 07:57:33 -08:00
43efdf18a0 web: reload page on fetch error. Should help with expired cookies 2025-03-08 07:57:12 -08:00
c71ab8e9e8 chore: Release 2025-03-08 07:52:40 -08:00
408d6ed8ba web: only reload on version skew in release 2025-03-08 07:52:03 -08:00
1411961e36 web: center contents in catchup mode 2025-03-08 07:52:03 -08:00
dfd7ef466c Only rebuild on push 2025-03-08 07:52:03 -08:00
2aa3dfbd0f fix(deps): update rust crate serde_json to v1.0.140 2025-03-03 09:46:00 +00:00
fba10e27cf fix(deps): update all non-major dependencies 2025-03-03 06:03:25 +00:00
5417c74f9c fix(deps): update rust crate thiserror to v2.0.12 2025-03-03 04:46:31 +00:00
eb0b0dbe81 chore(deps): lock file maintenance 2025-03-03 00:01:36 +00:00
561f522658 fix(deps): update rust crate mailparse to v0.16.1 2025-02-27 23:33:39 +00:00
32d2ffeb3d chore: Release 2025-02-27 15:16:09 -08:00
d41946e0a5 web: change style for mark read catchup button 2025-02-27 15:15:49 -08:00
61402858f4 web: add TODO 2025-02-27 15:15:42 -08:00
17de318645 chore: Release 2025-02-26 15:43:34 -08:00
3aa0144e8d web: try setting history.scroll_restoration to manual to improve inter-page flow 2025-02-26 15:43:18 -08:00
f9eafff4c7 web: add "go home" button to catchup view 2025-02-26 15:43:18 -08:00
4c6d67901d fix(deps): update rust crate uuid to v1.15.1 2025-02-26 21:15:57 +00:00
e9aa97a089 fix(deps): update rust crate chrono to v0.4.40 2025-02-26 08:46:20 +00:00
a82b047f75 fix(deps): update rust crate uuid to v1.15.0 2025-02-26 06:16:01 +00:00
9a8b44a8df fix(deps): update all non-major dependencies to 0.0.40 2025-02-26 04:47:10 +00:00
a96693004c chore: Release 2025-02-25 20:43:47 -08:00
ed9fe11fbf web: trimmed views for catchup mode 2025-02-25 20:43:27 -08:00
09fb14a796 chore: Release 2025-02-25 20:08:44 -08:00
58a7936bba web: address lint 2025-02-25 20:08:31 -08:00
cd0ee361f5 chore: Release 2025-02-25 20:06:18 -08:00
77bd5abe0d Don't do incremental builds when release 2025-02-25 20:06:11 -08:00
450c5496b3 chore: Release 2025-02-25 20:04:01 -08:00
4411e45a3c Don't allow warnings when publishing 2025-02-25 20:03:40 -08:00
e7d20896d5 web: remove unnecessary Msg variant 2025-02-25 16:20:32 -08:00
32a1115abd chore: Release 2025-02-25 15:58:46 -08:00
4982057500 web: more scroll to top improvements by reworking URL changes 2025-02-25 15:58:24 -08:00
8977f8bab5 chore: Release 2025-02-25 13:51:38 -08:00
0962a6b3cf web: improve scroll-to-top behavior 2025-02-25 13:51:11 -08:00
3c72929a4f web: enable properly styled buttons 2025-02-25 10:26:16 -08:00
e4eb495a70 web: properly exit catchup mode when done 2025-02-25 10:25:28 -08:00
00e8b0342e chore: Release 2025-02-24 18:41:19 -08:00
b1f9867c06 web: remove debug statement 2025-02-24 18:41:00 -08:00
77943b3570 web: scroll to top on page changes 2025-02-24 18:39:47 -08:00
45e4edb1dd web: add icons to catchup controls 2025-02-24 17:09:16 -08:00
9bf53afebf server: sort catchup ids by timestamp across all sources 2025-02-24 17:08:57 -08:00
e1a502ac4b chore: Release 2025-02-24 14:56:17 -08:00
9346c46e62 web: change exit catchup behavior to view current message 2025-02-24 14:55:51 -08:00
1452746305 chore: Release 2025-02-24 14:38:44 -08:00
2e526dace1 Implement catchup mode
Show original/delivered To if no xinu.tv addresses in To/CC fields
2025-02-24 14:38:18 -08:00
76be5b7cac fix(deps): update rust crate clap to v4.5.31 2025-02-24 16:00:55 +00:00
3f0b2caedf fix(deps): update rust crate scraper to 0.23.0 2025-02-24 09:31:24 +00:00
ec6dc35ca8 chore(deps): lock file maintenance 2025-02-24 00:01:18 +00:00
01e1ca927e chore: Release 2025-02-23 11:47:04 -08:00
1cc52d6c96 web: show X-Original-To: if To: is missing, fallback to Delivered-To: 2025-02-23 11:46:21 -08:00
e6b3a5b5a9 notmuch & server: plumb Delivered-To and X-Original-To headers 2025-02-23 09:37:09 -08:00
bc4b15a5aa chore: Release 2025-02-22 17:58:37 -08:00
00f61cf6be server: recursively descend email threads to find all unread recipients 2025-02-22 17:58:07 -08:00
52e24437bd chore: Release 2025-02-22 17:27:54 -08:00
393ffc8506 notmuch: normalize unread_recipients to lower case 2025-02-22 17:27:30 -08:00
2b6cb6ec6e chore: Release 2025-02-22 17:24:31 -08:00
0cba3a624c web: add de/select all checkbox with tristate 2025-02-22 17:24:18 -08:00
73433711ca fix(deps): update rust crate xtracing to 0.3.0 2025-02-23 00:02:30 +00:00
965afa6871 Merge pull request 'fix(deps): update rust crate seed_hooks to 0.4.0' (#48) from renovate/seed_hooks-0.x into master
Reviewed-on: #48
2025-02-22 15:49:50 -08:00
e70dbaf917 fix(deps): update rust crate seed_hooks to 0.4.0 2025-02-22 15:18:33 -08:00
6b4ce11743 fix(deps): update rust crate xtracing to v0.2.1 2025-02-22 22:31:55 +00:00
d1980a55a7 fix(deps): update rust crate cacher to v0.1.5 2025-02-22 21:16:46 +00:00
8b78b39d4c chore: Release 2025-02-22 13:10:03 -08:00
ae17651eb5 Normalize Justfile config 2025-02-22 13:08:15 -08:00
22fd8409f6 chore: Release 2025-02-22 12:41:57 -08:00
d0a4ba417f chore: Release 2025-02-22 12:41:30 -08:00
7b09b098a4 chore: Release 2025-02-22 12:41:15 -08:00
bd4c10a8fb Specify registry for all letterbox-* deps 2025-02-22 12:41:15 -08:00
ed3c5f152e chore: Release 2025-02-22 12:41:15 -08:00
63232d1e92 Publish only to xinu 2025-02-22 12:41:15 -08:00
4a3eba80d5 chore: Release 2025-02-22 12:41:15 -08:00
71d3745342 Try relative paths for letterbox-* deps 2025-02-22 12:41:14 -08:00
5fdc98633d chore: Release 2025-02-22 12:39:39 -08:00
57877f268d Set repository in workspace 2025-02-22 12:39:20 -08:00
871a93d58f Move most package metadata to workspace 2025-02-22 12:39:20 -08:00
4b7cbd4f9b chore: Release 2025-02-22 12:39:19 -08:00
aa2a9815df Add automatic per-email address unread folders 2025-02-22 12:38:57 -08:00
2e5b18a008 Fix cargo-udeps build step 2025-02-22 12:37:27 -08:00
d0a38114cc Add cargo-udeps build step 2025-02-22 12:37:27 -08:00
ccc1d516c7 fix(deps): update rust crate letterbox-notmuch to 0.8.0 2025-02-22 19:15:52 +00:00
246b710fdd fix(deps): update rust crate log to v0.4.26 2025-02-21 05:46:06 +00:00
1a21c9fa8e fix(deps): update rust crate uuid to v1.14.0 2025-02-21 00:30:51 +00:00
9fd912b1d4 fix(deps): update rust crate serde to v1.0.218 2025-02-20 05:31:10 +00:00
9ded32f97b fix(deps): update rust crate anyhow to v1.0.96 2025-02-20 03:16:55 +00:00
10aac046bc fix(deps): update rust crate serde_json to v1.0.139 2025-02-20 03:00:53 +00:00
f4527baf89 fix(deps): update rust crate seed_hooks to 0.4.0 2025-02-18 20:15:48 +00:00
11ec5bf747 fix(deps): update rust crate uuid to v1.13.2 2025-02-17 23:46:05 +00:00
6a53679755 fix(deps): update rust crate clap to v4.5.30
All checks were successful
Continuous integration / Check (pull_request) Successful in 37s
Continuous integration / Trunk (pull_request) Successful in 37s
Continuous integration / Rustfmt (pull_request) Successful in 31s
Continuous integration / Test Suite (pull_request) Successful in 2m0s
Continuous integration / build (pull_request) Successful in 46s
Continuous integration / Check (push) Successful in 35s
Continuous integration / Trunk (push) Successful in 38s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / Test Suite (push) Successful in 1m55s
Continuous integration / build (push) Successful in 47s
2025-02-17 19:15:50 +00:00
7bedec0692 chore(deps): lock file maintenance
All checks were successful
Continuous integration / Test Suite (pull_request) Successful in 41s
Continuous integration / Check (pull_request) Successful in 1m34s
Continuous integration / Trunk (pull_request) Successful in 37s
Continuous integration / Rustfmt (pull_request) Successful in 50s
Continuous integration / build (pull_request) Successful in 47s
Continuous integration / Check (push) Successful in 36s
Continuous integration / Trunk (push) Successful in 38s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / build (push) Successful in 47s
Continuous integration / Test Suite (push) Successful in 2m48s
2025-02-17 00:01:14 +00:00
78feb95811 chore: Release
All checks were successful
Continuous integration / Test Suite (push) Successful in 1m27s
Continuous integration / Check (push) Successful in 1m39s
Continuous integration / Trunk (push) Successful in 45s
Continuous integration / Rustfmt (push) Successful in 53s
Continuous integration / build (push) Successful in 1m10s
2025-02-15 14:49:11 -08:00
3aad2bb80e web: another attempt to fix progress bar 2025-02-15 14:47:32 -08:00
0df8de3661 web: use seed_hooks ability to create ev handlers 2025-02-15 14:47:32 -08:00
83ecc73fbd fix(deps): update rust crate seed_hooks to v0.1.16
All checks were successful
Continuous integration / Test Suite (pull_request) Successful in 41s
Continuous integration / Check (pull_request) Successful in 1m24s
Continuous integration / Trunk (pull_request) Successful in 39s
Continuous integration / Rustfmt (pull_request) Successful in 48s
Continuous integration / build (pull_request) Successful in 48s
Continuous integration / Check (push) Successful in 35s
Continuous integration / Trunk (push) Successful in 37s
Continuous integration / Test Suite (push) Successful in 1m41s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / build (push) Successful in 1m49s
2025-02-14 01:15:49 +00:00
c10313cd12 fix(deps): update rust crate letterbox-shared to 0.6.0
All checks were successful
Continuous integration / Test Suite (pull_request) Successful in 39s
Continuous integration / Check (pull_request) Successful in 1m24s
Continuous integration / Trunk (pull_request) Successful in 37s
Continuous integration / Rustfmt (pull_request) Successful in 48s
Continuous integration / build (pull_request) Successful in 46s
Continuous integration / Check (push) Successful in 35s
Continuous integration / Trunk (push) Successful in 37s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / Test Suite (push) Successful in 1m44s
Continuous integration / build (push) Successful in 46s
2025-02-13 23:31:34 +00:00
4c98bcd9cb Merge pull request 'fix(deps): update rust crate letterbox-notmuch to 0.6.0' (#34) from renovate/letterbox-notmuch-0.x into master
All checks were successful
Continuous integration / Check (push) Successful in 35s
Continuous integration / Trunk (push) Successful in 38s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / Test Suite (push) Successful in 1m52s
Continuous integration / build (push) Successful in 46s
Reviewed-on: #34
2025-02-13 15:17:39 -08:00
004de235a8 fix(deps): update rust crate letterbox-notmuch to 0.6.0
Some checks failed
renovate/artifacts Artifact file update failure
Continuous integration / Check (push) Successful in 36s
Continuous integration / Test Suite (push) Successful in 39s
Continuous integration / Trunk (push) Successful in 38s
Continuous integration / Rustfmt (push) Successful in 52s
Continuous integration / build (push) Successful in 47s
Continuous integration / Test Suite (pull_request) Successful in 40s
Continuous integration / Check (pull_request) Successful in 1m20s
Continuous integration / Trunk (pull_request) Successful in 37s
Continuous integration / Rustfmt (pull_request) Successful in 51s
Continuous integration / build (pull_request) Successful in 48s
2025-02-13 23:16:31 +00:00
90dbeb6f20 chore: Release
All checks were successful
Continuous integration / Test Suite (push) Successful in 40s
Continuous integration / Trunk (push) Successful in 37s
Continuous integration / Check (push) Successful in 1m27s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / build (push) Successful in 1m54s
2025-02-13 15:09:58 -08:00
9aa298febe web: use crate version of seed_hooks 2025-02-13 15:09:34 -08:00
5a13a497dc chore: Release 2025-02-13 14:30:47 -08:00
37711e14dd chore: Release 2025-02-13 14:01:24 -08:00
e89fd28707 web: pin seed_hooks version 2025-02-13 14:01:06 -08:00
7a91ee2f49 chore: Release 2025-02-13 13:29:52 -08:00
4b76ea5392 Justfile: run release w/ --no-confirm 2025-02-13 13:29:29 -08:00
d2a81b7bd9 Revert "Justfile: try without --workspace flag"
This reverts commit 9dd39509b5.
2025-02-13 13:29:17 -08:00
9dd39509b5 Justfile: try without --workspace flag 2025-02-13 13:28:35 -08:00
d605bcfe7a web: move to version 0.3 to sync with other crates 2025-02-13 13:25:01 -08:00
73abdb535a Justfile: actually call _release on build 2025-02-13 11:56:09 -08:00
ab9506c4f6 Starter justfile that will hopefully replace make 2025-02-13 11:51:59 -08:00
994a629401 web: update letterbox-notmuch dependency
All checks were successful
Continuous integration / Check (push) Successful in 37s
Continuous integration / Trunk (push) Successful in 50s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / build (push) Successful in 46s
Continuous integration / Test Suite (push) Successful in 2m41s
2025-02-13 11:37:32 -08:00
00c55160a7 Add web back to workspace 2025-02-13 11:31:43 -08:00
e3c6edb894 Merge pull request 'fix(deps): update rust crate letterbox-shared to 0.3.0' (#35) from renovate/letterbox-shared-0.x into master
Some checks failed
Continuous integration / Test Suite (push) Successful in 41s
Continuous integration / Trunk (push) Failing after 32s
Continuous integration / Check (push) Successful in 1m23s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / build (push) Successful in 1m49s
Reviewed-on: #35
2025-02-13 11:31:21 -08:00
4574c016cd fix(deps): update rust crate letterbox-shared to 0.3.0
Some checks failed
renovate/artifacts Artifact file update failure
Continuous integration / Test Suite (push) Successful in 39s
Continuous integration / Check (push) Successful in 1m12s
Continuous integration / Trunk (push) Failing after 33s
Continuous integration / Rustfmt (push) Successful in 47s
Continuous integration / build (push) Successful in 46s
Continuous integration / Test Suite (pull_request) Successful in 39s
Continuous integration / Check (pull_request) Successful in 1m16s
Continuous integration / Trunk (pull_request) Failing after 32s
Continuous integration / Rustfmt (pull_request) Successful in 47s
Continuous integration / build (pull_request) Successful in 46s
2025-02-13 18:45:52 +00:00
ca6c19f4c8 chore: Release
Some checks failed
Continuous integration / Check (push) Successful in 35s
Continuous integration / Test Suite (push) Successful in 39s
Continuous integration / Trunk (push) Failing after 6m23s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / build (push) Successful in 1m47s
2025-02-13 10:32:43 -08:00
0f51f6e71f server: copy vars.css from web so I can publish release 2025-02-13 10:32:20 -08:00
4bd672bf94 chore: Release 2025-02-13 10:18:40 -08:00
136fd77f3b Add server back to workspace 2025-02-13 10:18:30 -08:00
ee9b6be95e Temporarily remove web and server from workspace to publish other crates
Some checks failed
Continuous integration / Test Suite (push) Successful in 28s
Continuous integration / Check (push) Successful in 42s
Continuous integration / Trunk (push) Failing after 28s
Continuous integration / Rustfmt (push) Successful in 36s
Continuous integration / build (push) Successful in 27s
2025-02-13 10:16:55 -08:00
38c553d385 Use packaged version of crates 2025-02-13 10:16:36 -08:00
1b073665a7 chore: Release 2025-02-13 09:49:11 -08:00
2076596f50 Rename all crates to start with letterbox- 2025-02-13 09:48:24 -08:00
d1beaded09 Update Cargo.toml for packaging 2025-02-13 09:47:41 -08:00
2562bdfedf server: tool for testing inline code 2025-02-13 09:47:41 -08:00
86c6face7d server: sql to debug search indexing w/ postgres 2025-02-13 09:47:41 -08:00
4a7ff8bf7b notmuch: exclude testdata dir when packaging
Contains filenames cargo package doesn't like
2025-02-13 09:47:41 -08:00
8c280d3616 web: fix styling for slashdot's story byline 2025-02-13 09:47:41 -08:00
eb4d4164ef web: fix progress bar on mobile 2025-02-13 09:47:41 -08:00
c7740811bf fix(deps): update rust crate opentelemetry to 0.28.0
All checks were successful
Continuous integration / Test Suite (pull_request) Successful in 38s
Continuous integration / Check (pull_request) Successful in 1m22s
Continuous integration / Trunk (pull_request) Successful in 38s
Continuous integration / Rustfmt (pull_request) Successful in 51s
Continuous integration / build (pull_request) Successful in 46s
Continuous integration / Test Suite (push) Successful in 39s
Continuous integration / Trunk (push) Successful in 37s
Continuous integration / Check (push) Successful in 1m22s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / build (push) Successful in 1m49s
2025-02-12 21:30:57 +00:00
55679cf61b fix(deps): update rust crate xtracing to 0.2.0
All checks were successful
Continuous integration / Test Suite (pull_request) Successful in 38s
Continuous integration / Check (pull_request) Successful in 1m18s
Continuous integration / Trunk (pull_request) Successful in 37s
Continuous integration / Rustfmt (pull_request) Successful in 50s
Continuous integration / build (pull_request) Successful in 46s
Continuous integration / Test Suite (push) Successful in 39s
Continuous integration / Trunk (push) Successful in 38s
Continuous integration / Check (push) Successful in 1m35s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / build (push) Successful in 1m42s
2025-02-12 21:15:55 +00:00
1b1c80b1b8 web: annotate some more (temporary) dead code
All checks were successful
Continuous integration / Check (push) Successful in 36s
Continuous integration / Trunk (push) Successful in 37s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / Test Suite (push) Successful in 1m52s
Continuous integration / build (push) Successful in 46s
2025-02-12 13:03:45 -08:00
8743b1f56b web: install trunk in CI
Some checks failed
Continuous integration / Test Suite (push) Successful in 39s
Continuous integration / Check (push) Successful in 1m17s
Continuous integration / Rustfmt (push) Successful in 50s
Continuous integration / build (push) Successful in 1m41s
Continuous integration / Trunk (push) Failing after 3m38s
2025-02-12 11:46:31 -08:00
eb6f1b5346 web: run trunk build in CI
Some checks failed
Continuous integration / Check (push) Successful in 35s
Continuous integration / Test Suite (push) Successful in 40s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / Trunk (push) Failing after 58s
Continuous integration / build (push) Successful in 46s
2025-02-12 09:03:37 -08:00
6bb6d380a9 Bumping version to 0.0.144
All checks were successful
Continuous integration / Check (push) Successful in 37s
Continuous integration / Rustfmt (push) Successful in 51s
Continuous integration / Test Suite (push) Successful in 1m44s
Continuous integration / build (push) Successful in 53s
2025-02-12 08:50:09 -08:00
39eea04bf6 Bumping version to 0.0.143 2025-02-12 08:50:04 -08:00
2711147cd6 web: hide nautilus ads 2025-02-12 08:50:04 -08:00
083b7c9f1c Merge pull request 'fix(deps): update rust crate thiserror to v2' (#27) from renovate/thiserror-2.x into master
All checks were successful
Continuous integration / Check (push) Successful in 34s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / Test Suite (push) Successful in 1m45s
Continuous integration / build (push) Successful in 45s
Reviewed-on: #27
2025-02-11 20:27:41 -08:00
5ade886a72 fix(deps): update rust-wasm-bindgen monorepo
All checks were successful
Continuous integration / Test Suite (pull_request) Successful in 38s
Continuous integration / Check (pull_request) Successful in 1m32s
Continuous integration / Rustfmt (pull_request) Successful in 31s
Continuous integration / build (pull_request) Successful in 2m0s
Continuous integration / Test Suite (push) Successful in 38s
Continuous integration / Check (push) Successful in 1m32s
Continuous integration / Rustfmt (push) Successful in 33s
Continuous integration / build (push) Successful in 2m9s
2025-02-12 00:46:04 +00:00
52575e13f6 Bumping version to 0.0.142
All checks were successful
Continuous integration / Check (push) Successful in 37s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / Test Suite (push) Successful in 1m32s
Continuous integration / build (push) Successful in 46s
2025-02-11 16:42:24 -08:00
3aaee8add3 web: rollback wasm-bindgen 2025-02-11 16:42:10 -08:00
5e188a70f9 fix(deps): update rust crate clap to v4.5.29
All checks were successful
Continuous integration / Test Suite (pull_request) Successful in 39s
Continuous integration / Rustfmt (pull_request) Successful in 49s
Continuous integration / build (pull_request) Successful in 45s
Continuous integration / Check (pull_request) Successful in 3m7s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / build (push) Successful in 44s
Continuous integration / Check (push) Successful in 1m46s
Continuous integration / Test Suite (push) Successful in 5m36s
2025-02-11 20:00:45 +00:00
f9e5c87d2b fix(deps): update rust-wasm-bindgen monorepo
All checks were successful
Continuous integration / Test Suite (pull_request) Successful in 38s
Continuous integration / Check (pull_request) Successful in 1m20s
Continuous integration / Rustfmt (pull_request) Successful in 32s
Continuous integration / build (pull_request) Successful in 1m36s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / Check (push) Successful in 1m22s
Continuous integration / build (push) Successful in 45s
Continuous integration / Test Suite (push) Successful in 5m40s
2025-02-11 16:46:05 +00:00
7d40cf8a4a Bumping version to 0.0.141
All checks were successful
Continuous integration / Check (push) Successful in 37s
Continuous integration / Test Suite (push) Successful in 41s
Continuous integration / Rustfmt (push) Successful in 1m1s
Continuous integration / build (push) Successful in 47s
2025-02-11 08:36:30 -08:00
1836026736 update cacher dependency 2025-02-11 08:36:24 -08:00
79db0f8cfa Bumping version to 0.0.140
All checks were successful
Continuous integration / Check (push) Successful in 36s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / build (push) Successful in 46s
Continuous integration / Test Suite (push) Successful in 5m10s
2025-02-10 17:44:22 -08:00
95c29dc73c web: CSS indent lists 2025-02-10 17:44:07 -08:00
2b0ee42cdc Bumping version to 0.0.139
Some checks are pending
Continuous integration / Check (push) Waiting to run
Continuous integration / Test Suite (push) Waiting to run
Continuous integration / Rustfmt (push) Waiting to run
Continuous integration / build (push) Waiting to run
2025-02-10 17:33:46 -08:00
c90ac1d4fc web: pin web-sys to 0.2.95, to work with CLI in nixos 2025-02-10 17:33:17 -08:00
a9803bb6a1 fix(deps): update rust crate thiserror to v2
Some checks failed
renovate/artifacts Artifact file update failure
Continuous integration / Check (push) Successful in 35s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / build (push) Successful in 46s
Continuous integration / Test Suite (push) Successful in 1m49s
Continuous integration / Check (pull_request) Successful in 48s
Continuous integration / Rustfmt (pull_request) Successful in 32s
Continuous integration / Test Suite (pull_request) Successful in 2m2s
Continuous integration / build (pull_request) Successful in 7m15s
2025-02-11 01:31:42 +00:00
74219ad333 web: fix uuid dep
All checks were successful
Continuous integration / Check (push) Successful in 1m49s
Continuous integration / Rustfmt (push) Successful in 34s
Continuous integration / build (push) Successful in 1m19s
Continuous integration / Test Suite (push) Successful in 4m17s
2025-02-10 17:28:04 -08:00
2073b7b132 Changes necessary for latest cargo packages 2025-02-10 14:57:40 -08:00
58dae5df6f gitea: initial setup
Some checks failed
Continuous integration / Check (push) Failing after 2m48s
Continuous integration / Test Suite (push) Failing after 4m52s
Continuous integration / Rustfmt (push) Successful in 58s
Continuous integration / build (push) Failing after 4m58s
2025-02-09 18:34:07 -08:00
c89fc9b6d4 Merge pull request 'fix(deps): update rust crate mailparse to 0.16.0' (#28) from renovate/mailparse-0.x into master
Reviewed-on: #28
2025-02-09 15:33:57 -08:00
f7ab08c1e6 fix(deps): update rust crate mailparse to 0.16.0 2025-02-09 23:30:40 +00:00
221fead7dc cargo update 2025-02-09 14:54:13 -08:00
3491cb9593 Merge pull request 'fix(deps): update rust crate tokio to v1.43.0' (#24) from renovate/tokio-1.x-lockfile into master
Reviewed-on: #24
2025-02-09 14:52:02 -08:00
037b3231ac fix(deps): update rust crate tokio to v1.43.0 2025-02-09 22:45:44 +00:00
75f38c1e94 Merge pull request 'fix(deps): update rust crate scraper to 0.22.0' (#23) from renovate/scraper-0.x into master
Reviewed-on: #23
2025-02-09 14:30:43 -08:00
977bcd0bf4 Merge pull request 'fix(deps): update rust crate itertools to 0.14.0' (#22) from renovate/itertools-0.x into master
Reviewed-on: #22
2025-02-09 14:30:33 -08:00
838459e5a8 Merge pull request 'fix(deps): update rust crate graphql_client to 0.14.0' (#21) from renovate/graphql_client-0.x into master
Reviewed-on: #21
2025-02-09 14:30:21 -08:00
d208a31348 Merge pull request 'fix(deps): update rust crate gloo-net to 0.6.0' (#20) from renovate/gloo-net-0.x into master
Reviewed-on: #20
2025-02-09 14:30:12 -08:00
0a640bea6f Merge pull request 'fix(deps): update rust crate css-inline to 0.14.0' (#19) from renovate/css-inline-0.x into master
Reviewed-on: #19
2025-02-09 14:30:02 -08:00
84a2962561 Merge pull request 'chore(deps): update dependency font-awesome to v6.7.2' (#18) from renovate/font-awesome-6.x into master
Reviewed-on: #18
2025-02-09 14:29:49 -08:00
6c71be7a3a Merge pull request 'fix(deps): update rust crate xtracing to v0.1.3' (#16) from renovate/xtracing-0.x-lockfile into master
Reviewed-on: #16
2025-02-09 14:29:36 -08:00
77562505b4 Merge pull request 'fix(deps): update rust crate sqlx to v0.8.3' (#15) from renovate/sqlx-0.x-lockfile into master
Reviewed-on: #15
2025-02-09 14:29:24 -08:00
c83d3dcf1d Merge pull request 'fix(deps): update rust crate serde_json to v1.0.138' (#14) from renovate/serde_json-1.x-lockfile into master
Reviewed-on: #14
2025-02-09 14:29:03 -08:00
081077d2c2 Merge pull request 'fix(deps): update rust crate serde to v1.0.217' (#13) from renovate/serde-monorepo into master
Reviewed-on: #13
2025-02-09 14:28:53 -08:00
4cfc6a73fc Merge pull request 'fix(deps): update rust crate log to v0.4.25' (#11) from renovate/log-0.x-lockfile into master
Reviewed-on: #11
2025-02-09 14:28:43 -08:00
f1c132830f Merge pull request 'fix(deps): update rust crate clap to v4.5.28' (#10) from renovate/clap-4.x-lockfile into master
Reviewed-on: #10
2025-02-09 14:28:30 -08:00
5aff7c6e85 Merge pull request 'fix(deps): update rust crate cacher to v0.1.4' (#9) from renovate/cacher-0.x-lockfile into master
Reviewed-on: #9
2025-02-09 14:28:19 -08:00
2c09713e20 Merge pull request 'fix(deps): update rust crate async-trait to v0.1.86' (#7) from renovate/async-trait-0.x-lockfile into master
Reviewed-on: #7
2025-02-09 14:28:07 -08:00
3d544feeb5 Merge pull request 'fix(deps): update rust crate ammonia to v4' (#25) from renovate/ammonia-4.x into master
Reviewed-on: #25
2025-02-09 13:57:23 -08:00
5830ed0bb1 Merge branch 'master' into renovate/ammonia-4.x 2025-02-09 13:57:13 -08:00
83aed683f5 fix(deps): update rust crate sqlx to v0.8.3 2025-02-09 21:15:54 +00:00
72385b3987 Merge pull request 'fix(deps): update rust crate lol_html to v2' (#26) from renovate/lol_html-2.x into master
Reviewed-on: #26
2025-02-09 13:01:28 -08:00
f21893b52e Bumping version to 0.0.138 2025-02-09 12:52:36 -08:00
0b81529509 build-info: one last version bump 2025-02-09 12:52:23 -08:00
9790bbea83 Bumping version to 0.0.137 2025-02-09 12:49:53 -08:00
7aa620a9da Update all build-info versions to fix build 2025-02-09 12:49:25 -08:00
2e67db0b4e fix(deps): update rust crate css-inline to 0.14.0 2025-02-09 20:30:48 +00:00
cd777b2894 fix(deps): update rust crate lol_html to v2 2025-02-09 20:17:15 +00:00
049e9728a2 fix(deps): update rust crate ammonia to v4 2025-02-09 20:17:10 +00:00
0952cdf9cb fix(deps): update rust crate scraper to 0.22.0 2025-02-09 20:16:59 +00:00
5f4a4e81cb fix(deps): update rust crate itertools to 0.14.0 2025-02-09 20:16:54 +00:00
38c2c508e8 fix(deps): update rust crate graphql_client to 0.14.0 2025-02-09 20:16:48 +00:00
4cd3664e32 fix(deps): update rust crate gloo-net to 0.6.0 2025-02-09 20:16:44 +00:00
71996f6c48 chore(deps): update dependency font-awesome to v6.7.2 2025-02-09 20:16:33 +00:00
6e227de00f fix(deps): update rust crate xtracing to v0.1.3 2025-02-09 20:16:24 +00:00
3576e67af7 Merge pull request 'fix(deps): update rust crate reqwest to v0.12.12' (#12) from renovate/reqwest-0.x-lockfile into master
Reviewed-on: #12
2025-02-09 12:16:14 -08:00
19f0f60653 fix(deps): update rust crate serde_json to v1.0.138 2025-02-09 20:16:12 +00:00
3502eeb711 fix(deps): update rust crate serde to v1.0.217 2025-02-09 20:16:02 +00:00
fd770d03ab fix(deps): update rust crate reqwest to v0.12.12 2025-02-09 20:15:54 +00:00
d99b7ae34c fix(deps): update rust crate log to v0.4.25 2025-02-09 20:15:48 +00:00
f18aa8c8d4 fix(deps): update rust crate clap to v4.5.28 2025-02-09 20:15:35 +00:00
dcdcb5b5a3 fix(deps): update rust crate cacher to v0.1.4 2025-02-09 20:15:31 +00:00
884e4b5831 fix(deps): update rust crate async-trait to v0.1.86 2025-02-09 20:15:19 +00:00
5981356492 Merge pull request 'fix(deps): update rust crate async-graphql-rocket to v7.0.15' (#6) from renovate/async-graphql-rocket-7.x-lockfile into master
Reviewed-on: #6
2025-02-09 12:10:15 -08:00
386b6915c5 fix(deps): update rust crate async-graphql-rocket to v7.0.15 2025-02-09 20:09:39 +00:00
5a6f04536f Merge pull request 'chore(deps): update rust crate build-info-build to 0.0.39' (#2) from renovate/build-info-build-0.x into master
Reviewed-on: #2
2025-02-09 11:21:21 -08:00
ae1d9e6db7 Merge pull request 'fix(deps): update rust crate anyhow to v1.0.95' (#3) from renovate/anyhow-1.x-lockfile into master
Reviewed-on: #3
2025-02-09 11:21:03 -08:00
24d50c21f5 fix(deps): update rust crate anyhow to v1.0.95 2025-02-09 19:08:21 +00:00
b4d72da639 chore(deps): update rust crate build-info-build to 0.0.39 2025-02-09 19:08:17 +00:00
dacb258289 Merge pull request 'chore: Configure Renovate' (#1) from renovate/configure into master
Reviewed-on: #1
2025-02-09 11:06:35 -08:00
5c674d4603 Add renovate.json 2025-02-09 19:01:46 +00:00
2e9753e91d Bumping version to 0.0.136 2025-02-06 08:17:10 -08:00
971e1049c7 web: allow plaintext emails to wrap 2025-02-06 08:16:53 -08:00
11c76332f3 Bumping version to 0.0.135 2025-02-06 07:46:34 -08:00
52d03ae964 web: tweak figure bg color on hackaday 2025-02-06 07:46:13 -08:00
c4043f6c56 Bumping version to 0.0.134 2025-02-05 09:18:40 -08:00
dfbac38281 web: style blockquotes in emails 2025-02-05 09:18:05 -08:00
f857c38625 Bumping version to 0.0.133 2025-02-02 09:52:05 -08:00
23823cd85e web: provide CSS overrides for email matching news posts 2025-02-02 09:51:27 -08:00
30b5d0ff9f Bumping version to 0.0.132 2025-01-30 20:19:21 -08:00
60a3b1ef88 web: remove accidentally committed line 2025-01-30 20:18:36 -08:00
a46390d110 Bumping version to 0.0.131 2025-01-30 17:45:35 -08:00
5baac0c77a web: fix width overflow on mobile and maybe progress bar 2025-01-30 17:45:14 -08:00
e6181d41ed web: address a bunch of dead code lint 2025-01-30 15:24:11 -08:00
6a228cfd5e Bumping version to 0.0.130 2025-01-30 14:16:30 -08:00
8d81067206 cargo sqlx prepare 2025-01-30 14:16:29 -08:00
b2e47a9bd4 server: round-robin by site when indexing searches 2025-01-30 14:16:12 -08:00
4eaf50cde4 Bumping version to 0.0.129 2025-01-30 13:55:52 -08:00
f20afe5447 update sqlx prepare 2025-01-30 13:55:38 -08:00
53093f4cce Bumping version to 0.0.128 2025-01-30 13:52:55 -08:00
9324a34d31 cargo sqlx prepare 2025-01-30 13:52:54 -08:00
eecc4bc3ef server: strip style & script tags, also handle some retryable errors on slurp 2025-01-30 13:52:22 -08:00
795029cb06 Bumping version to 0.0.127 2025-01-29 17:25:55 -08:00
bc0135106f server: error when get request has a bad response code 2025-01-29 17:25:26 -08:00
bd2803f81c Bumping version to 0.0.126 2025-01-29 17:10:42 -08:00
215addc2c0 cargo sqlx prepare 2025-01-29 17:10:41 -08:00
69f8e24689 server: index newest news posts first 2025-01-29 17:10:26 -08:00
0817a7a51b Bumping version to 0.0.125 2025-01-29 17:04:16 -08:00
200933591a cargo sqlx prepare 2025-01-29 17:04:15 -08:00
8b7c819b17 server: only index 100 search summaries at a time 2025-01-29 17:03:47 -08:00
dce433ab5a Bumping version to 0.0.124 2025-01-29 16:53:59 -08:00
eb4f2d8b5d server: filter out bad urls when indexing search summary 2025-01-29 16:53:38 -08:00
2008457911 Bumping version to 0.0.123 2025-01-29 16:13:50 -08:00
f6b57e63fd cargo sqlx prepare 2025-01-29 16:13:50 -08:00
d681612e8e server: index all search summaries on refresh 2025-01-29 16:13:44 -08:00
80454cbc7e Bumping version to 0.0.122 2025-01-29 15:44:05 -08:00
78cf59333e cargo sqlx prepare 2025-01-29 15:44:04 -08:00
ab47f32b52 server: fetch search summaries in parallel 2025-01-29 15:43:46 -08:00
d9d58afed9 Bumping version to 0.0.121 2025-01-29 15:24:55 -08:00
d01f9a7e08 cargo sqlx prepare 2025-01-29 15:24:54 -08:00
c6aabf88b9 server: sample DB for missing indexes, should prevent duplication from separate threads 2025-01-29 14:42:59 -08:00
29bf6d9b6d Bumping version to 0.0.120 2025-01-29 14:08:55 -08:00
92bf45bd15 cargo sqlx prepare 2025-01-29 14:08:54 -08:00
12c8e0e33b server: use fetched contents of news for search index 2025-01-29 14:08:20 -08:00
c7aa32b922 Bumping version to 0.0.119 2025-01-28 09:34:56 -08:00
94be4ec572 web: add archive buttons, and adjust when text on buttons is shown 2025-01-28 09:34:36 -08:00
66c299bc4c Bumping version to 0.0.118 2025-01-27 15:48:12 -08:00
d5c4176392 cargo sqlx prepare 2025-01-27 15:48:11 -08:00
bd00542c28 server: use clean_summary field instead of summary 2025-01-27 15:47:55 -08:00
19f029cb6b Bumping version to 0.0.117 2025-01-27 14:15:00 -08:00
198db1492a server: add another The Onion slurp config 2025-01-27 14:14:46 -08:00
f6665b6b6e Bumping version to 0.0.116 2025-01-27 14:01:30 -08:00
ee93d725ba web & server: finish initial tailwind rewrite 2025-01-27 14:00:46 -08:00
70fb635eda server: index on nzb_posts created_at, attempt to speed up homepage 2025-01-27 13:18:36 -08:00
b9fbefe05c server: format chrome css 2025-01-27 13:17:22 -08:00
46f823baae server: use local slurp cache separate from production 2025-01-27 13:16:55 -08:00
cc1e998ec5 web: style version chart 2025-01-26 16:01:35 -08:00
fb73d8272e web: update style for rendering emails, including attachments 2025-01-26 15:56:08 -08:00
87321fb669 web: update stylings for removable tag chiclets 2025-01-26 14:02:39 -08:00
44b60d5070 web: style checkboxes, tweak mobile search bar width 2025-01-26 13:42:20 -08:00
89897aa48f web: style search toolbar 2025-01-26 12:24:06 -08:00
b2879211e4 web: much nicer tag list styling with flex box 2025-01-26 10:58:27 -08:00
6b3567fb1b web: style tag list 2025-01-26 09:42:32 -08:00
c27bcac549 web: switch to debug build and enable minimal optimizations to make wasm work 2025-01-26 09:32:06 -08:00
25d31a6ce7 web: only use one view function, desktop/tablet/mobile handled in CSS 2025-01-26 09:31:44 -08:00
ea280dd366 web: stub out all C![] that need porting to tailwind 2025-01-25 16:56:44 -08:00
9842c8c99c server: add option to inline CSS before slurping contents 2025-01-25 16:09:05 -08:00
906ebd73b2 cargo: don't default to xinu repo, that was misguided 2025-01-25 16:05:05 -08:00
de95781ce7 More lint 2025-01-24 09:38:56 -08:00
c58234fa2e Lint 2025-01-24 09:37:49 -08:00
4099bbe732 Bumping version to 0.0.115 2025-01-19 17:22:37 -08:00
c693d4e78a server: strip html from search index of summaries 2025-01-19 17:22:24 -08:00
f90ff72316 server: fix tantivy/newsreader search bug 2025-01-19 17:22:20 -08:00
bed6ae01f2 Bumping version to 0.0.114 2025-01-19 16:50:50 -08:00
087d6b9a60 Use registry version of formerly git dependencies 2025-01-19 16:50:14 -08:00
b04caa9d5d Bumping version to 0.0.113 2025-01-17 15:51:39 -08:00
17b1125ea3 server: Use crate version of cacher 2025-01-17 15:51:28 -08:00
a8ac79d396 Bumping version to 0.0.112 2025-01-16 16:09:28 -08:00
30cbc260dc web: version bump wasm-bindgen-cli 2025-01-16 16:09:06 -08:00
4601b7e6d3 Bumping version to 0.0.111 2025-01-15 12:27:33 -08:00
28b6f565fd update cacher dependency 2025-01-15 12:27:29 -08:00
48b63b19d5 Bumping version to 0.0.110 2025-01-14 20:55:53 -08:00
184afbb4ee update cacher dependency 2025-01-14 20:55:49 -08:00
f6217810ea Bumping version to 0.0.109 2025-01-14 16:22:24 -08:00
46e2de341b update cacher dependency 2025-01-14 16:22:20 -08:00
9c56fde0b6 Bumping version to 0.0.108 2025-01-14 12:05:38 -08:00
2051e5ebf2 cargo sqlx prepare 2025-01-14 12:05:37 -08:00
5a997e61da web & server: add support for email photos 2025-01-14 12:05:03 -08:00
f27f0deb38 Revert "Remove DB tables that don't seem to work"
This reverts commit 70f437b939.
2025-01-13 21:03:56 -08:00
70f437b939 Remove DB tables that don't seem to work 2025-01-13 20:50:19 -08:00
59648a1b25 Bumping version to 0.0.107 2025-01-12 16:35:17 -08:00
76482c6c15 server: make pagination slightly less bad 2025-01-12 16:35:11 -08:00
de23bae8bd server: add request_id to all graphql logging 2025-01-12 11:40:31 -08:00
e07c0616a2 Bumping version to 0.0.106 2025-01-12 09:26:23 -08:00
13a7de4956 web: refactor mark read logic to be two phases 2025-01-12 09:25:44 -08:00
9ce0aacab0 Bumping version to 0.0.105 2025-01-12 08:34:36 -08:00
ae502a7dfe Bumping version to 0.0.104 2025-01-02 15:19:24 -08:00
947c5970d8 update xtracing dependency 2025-01-02 15:19:17 -08:00
686d163cf6 update xtracing dependency 2025-01-02 15:18:49 -08:00
7c720e66f9 Bumping version to 0.0.103 2024-12-28 15:10:17 -08:00
1029fd7aa2 update cacher dependency 2024-12-28 15:10:12 -08:00
61e59ea315 Bumping version to 0.0.102 2024-12-28 15:09:21 -08:00
5047094bd7 update xtracing dependency 2024-12-28 15:09:16 -08:00
28bd9a9d89 Bumping version to 0.0.101 2024-12-28 15:08:52 -08:00
4b327eeccc update xtracing dependency 2024-12-28 15:08:48 -08:00
d13b5477a5 Bumping version to 0.0.100 2024-12-28 15:08:28 -08:00
8cad404098 update xtracing dependency 2024-12-28 15:08:24 -08:00
23de7186d6 Bumping version to 0.0.99 2024-12-28 15:06:37 -08:00
a26559a07e Bumping version to 0.0.98 2024-12-28 15:04:53 -08:00
1bc7ad9b95 update xtracing dependency 2024-12-28 15:04:48 -08:00
1ac844c08d Bumping version to 0.0.97 2024-12-28 15:04:02 -08:00
d7f7954e59 Bumping version to 0.0.96 2024-12-28 15:02:21 -08:00
ba16e537e6 Bumping version to 0.0.95 2024-12-28 15:00:36 -08:00
60304a23cc update xtracing dependency 2024-12-28 15:00:30 -08:00
ce6aa7d167 Bumping version to 0.0.94 2024-12-28 09:09:41 -08:00
fb55d87876 update xtracing dependency 2024-12-28 09:09:27 -08:00
63374871ac Bumping version to 0.0.93 2024-12-28 08:44:51 -08:00
405dcc5ca6 update cacher dependency 2024-12-28 08:44:47 -08:00
1544405d3a Bumping version to 0.0.92 2024-12-28 08:43:49 -08:00
3b547f6925 update cacher dependency 2024-12-28 08:43:41 -08:00
777f33e212 notmuch: add instrumentation to most public methods 2024-12-26 11:12:47 -08:00
7c7a8c0dcb Bumping version to 0.0.91 2024-12-25 16:22:36 -08:00
6c2722314b server: fix compile problem with new PG schema 2024-12-25 16:22:19 -08:00
7827c24016 Bumping version to 0.0.90 2024-12-25 16:19:16 -08:00
043e46128a cargo sqlx prepare 2024-12-25 16:19:15 -08:00
dad30357ac server: ensure post.link is not null and not empty 2024-12-25 10:12:33 -08:00
4c6b9cde39 Bumping version to 0.0.89 2024-12-25 08:03:13 -08:00
ffb210babb server: ensure uniqueness on post links 2024-12-25 08:02:36 -08:00
145d1c1787 Bumping version to 0.0.88 2024-12-21 16:52:24 -08:00
1708526e33 update xtracing dependency 2024-12-21 16:52:19 -08:00
f8f9b753a6 Bumping version to 0.0.87 2024-12-21 16:23:13 -08:00
7fbb0e0f43 update xtracing dependency 2024-12-21 16:23:09 -08:00
2686670df7 Bumping version to 0.0.86 2024-12-21 16:21:19 -08:00
732fb5054a update xtracing dependency 2024-12-21 16:21:14 -08:00
2abfbda2f0 Bumping version to 0.0.85 2024-12-21 16:19:42 -08:00
cce693174e update xtracing dependency 2024-12-21 16:19:37 -08:00
bec7ee40b4 Bumping version to 0.0.84 2024-12-21 13:18:26 -08:00
79a6245773 update xtracing dependency 2024-12-21 13:18:22 -08:00
3ae1c3fdff Bumping version to 0.0.83 2024-12-21 13:16:50 -08:00
eab96b3f84 update xtracing dependency 2024-12-21 13:16:44 -08:00
07b8db317b cargo sqlx prepare 2024-12-21 13:16:18 -08:00
9debec8daa Bumping version to 0.0.82 2024-12-21 13:10:22 -08:00
b129b99fd9 cargo sqlx prepare 2024-12-21 13:10:21 -08:00
a397bcf190 update xtracing dependency 2024-12-21 13:10:16 -08:00
13c80fe68f update xtracing dependency 2024-12-21 13:08:01 -08:00
438ab0015e update xtracing dependency 2024-12-21 13:07:14 -08:00
93f5145937 update xtracing dependency 2024-12-21 13:06:59 -08:00
36fcc349ec update xtracing dependency 2024-12-21 13:05:31 -08:00
63a1919872 update xtracing dependency 2024-12-21 13:02:59 -08:00
5b6d18bdbc Bumping version to 0.0.81 2024-12-20 09:25:51 -08:00
868d2fb434 xtracing version bump 2024-12-20 09:25:46 -08:00
6ad66a35e7 Bumping version to 0.0.80 2024-12-20 09:18:27 -08:00
cd750e7267 Update xtracing 2024-12-20 09:16:41 -08:00
40be07cb07 Bumping version to 0.0.79 2024-12-20 09:06:45 -08:00
e794a902dd server: clean up some renamed imports 2024-12-20 09:06:35 -08:00
94576e98fc Bumping version to 0.0.78 2024-12-20 09:06:08 -08:00
b7dcb2e875 server: rename crate and binary to letterbox-server 2024-12-20 09:05:35 -08:00
aa9a243894 Bumping version to 0.0.77 2024-12-20 08:43:47 -08:00
1911367aeb cargo update 2024-12-20 08:43:37 -08:00
93bb4a27b9 Bumping version to 0.0.76 2024-12-19 18:44:31 -08:00
0456efeed4 cargo sqlx prepare 2024-12-19 18:44:30 -08:00
3ac2fa290f server: use git version of xtracing 2024-12-19 18:44:13 -08:00
e7feb73f6f lint 2024-12-19 18:38:43 -08:00
5ddb4452ff email2db: stub CLI 2024-12-19 18:35:46 -08:00
760f90762d server: refer to async_graphql extensions through extensions module 2024-12-19 18:35:03 -08:00
51154044cc WIP 2024-12-19 12:56:53 -08:00
06c5cb6cbf Update offline sqlx files on build 2024-12-19 12:50:10 -08:00
0dc1f2cebe Bumping version to 0.0.75 2024-12-19 11:35:18 -08:00
0dec7aaf0e web: pin wasm-bindgen 2024-12-19 11:35:00 -08:00
6fa8d1856a Revert "web: fix breakage due to update in dependency"
This reverts commit 80d23204fe.
2024-12-19 11:34:33 -08:00
95a0279c68 Bumping version to 0.0.74 2024-12-19 11:04:55 -08:00
80d23204fe web: fix breakage due to update in dependency 2024-12-19 11:04:39 -08:00
f45123d6d9 Bumping version to 0.0.73 2024-12-19 10:53:51 -08:00
503913c54a Bumping version to 0.0.72 2024-12-19 10:46:47 -08:00
c4627a13b6 cargo sqlx prepare 2024-12-19 10:46:39 -08:00
e4427fe725 Bumping version to 0.0.71 2024-12-19 10:44:15 -08:00
78f5f00225 cargo update 2024-12-19 10:44:05 -08:00
c6fc34136a Version bump sqlx 2024-12-19 10:44:05 -08:00
1a270997c8 Update xtracing 2024-12-19 10:38:56 -08:00
390fbcceac Bumping version to 0.0.70 2024-12-17 13:57:25 -08:00
d7214f4f29 server: move notmuch refresh out of tantivy cfg block for refresh 2024-12-17 13:57:06 -08:00
b9aaf87dc2 Bumping version to 0.0.69 2024-12-17 09:38:26 -08:00
5ee9d754ba server: actually disable tantivy 2024-12-17 09:38:19 -08:00
dc04d54455 cargo sqlx prepare 2024-12-17 09:34:03 -08:00
9f730e937d Bumping version to 0.0.68 2024-12-17 09:32:13 -08:00
13eaf33b1a server: add postgres based newsreader search and disable tantivy 2024-12-17 09:31:51 -08:00
e36f4f97f9 server: run DB migrations on startup 2024-12-16 19:21:58 -08:00
092d5781ca Bumping version to 0.0.67 2024-12-16 19:21:34 -08:00
0697a5ea41 server: more instrumentation 2024-12-16 19:21:05 -08:00
607e9e2251 Bumping version to 0.0.66 2024-12-16 08:56:24 -08:00
c547170efb server: address lint 2024-12-16 08:56:16 -08:00
0222985f4d server: instrument newsreader impl 2024-12-16 08:56:05 -08:00
94c03a9c7c Bumping version to 0.0.65 2024-12-16 08:34:53 -08:00
4f4e474e66 server: explicitly reload tantivy reader after commit 2024-12-16 08:34:35 -08:00
7a1dec03a3 Bumping version to 0.0.64 2024-12-15 16:26:38 -08:00
f49bc071c2 server: version bump xtracing 2024-12-15 16:26:22 -08:00
8551f0c756 Bumping version to 0.0.63 2024-12-15 15:43:56 -08:00
ac4aaeb0f7 server: warn on failure to open tantivy 2024-12-15 15:43:44 -08:00
4ad963c3be Bumping version to 0.0.62 2024-12-15 15:18:36 -08:00
7c943afc2b server: attempt concurrency with graphql::search and fail 2024-12-15 15:09:41 -08:00
39ea5c5458 Bumping version to 0.0.61 2024-12-15 14:46:53 -08:00
6d8b2de608 server: improve tantivy performance by reusing IndexReader
Also improve a bunch of trace logging
2024-12-15 14:46:10 -08:00
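The commit above, together with the nearby "server: explicitly reload tantivy reader after commit", describes tantivy's recommended pattern: build one IndexReader up front, hand out cheap Searchers per query, and reload after indexing. A minimal sketch of that pattern; the NewsSearch wrapper is a hypothetical stand-in, not the repo's actual module:

```rust
use std::path::Path;
use tantivy::{Index, IndexReader, TantivyError};

/// Hypothetical wrapper; the server's real tantivy module will differ.
struct NewsSearch {
    reader: IndexReader,
}

impl NewsSearch {
    fn open(dir: &Path) -> Result<Self, TantivyError> {
        let index = Index::open_in_dir(dir)?;
        // Building an IndexReader is comparatively expensive, so build
        // it once and reuse it for every search.
        let reader = index.reader()?;
        Ok(Self { reader })
    }

    /// Call after an indexing commit so new documents become visible.
    fn refresh(&self) -> Result<(), TantivyError> {
        self.reader.reload()
    }

    fn num_docs(&self) -> u64 {
        // Grabbing a Searcher from a shared reader is cheap per query.
        self.reader.searcher().num_docs()
    }
}
```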
05cdcec244 notmuch: improved error handling and logging 2024-12-15 14:44:02 -08:00
a0eb8dcba6 server: add TODO 2024-12-14 11:56:33 -08:00
9fbfa378bb Bumping version to 0.0.60 2024-12-14 10:09:48 -08:00
872771b02a server: add tracing for graphql handling 2024-12-14 10:09:33 -08:00
416d82042f Bumping version to 0.0.59 2024-12-10 09:13:22 -08:00
a0eb291371 web: make post favicon more cacheable 2024-12-10 09:13:11 -08:00
4c88ee18d3 Bumping version to 0.0.58 2024-12-09 13:17:09 -08:00
410e582b44 web: use favicon for avatar when viewing a post 2024-12-09 13:16:55 -08:00
a3f720a51e Bumping version to 0.0.57 2024-12-08 18:05:12 -08:00
962b3542ce web: show email address on hover of name in message view 2024-12-08 18:03:20 -08:00
a6f0971f0f Bumping version to 0.0.56 2024-11-13 17:43:16 -08:00
21789df60a server: handle attachments with name in content-type, not disposition 2024-11-13 17:42:53 -08:00
584ff1504d cargo fmt to catch unformatted code while LSP was misconfigured 2024-11-03 08:33:10 -08:00
caff1a1ed3 web: remove unnecessary move 2024-10-30 20:07:32 -07:00
d7b4411017 web: update cargo edition 2024-10-30 19:59:06 -07:00
66ada655fc Bumping version to 0.0.55 2024-10-29 17:16:58 -07:00
8dea1f1bd6 web: fix styling on news post tags to match email 2024-10-29 17:16:45 -07:00
e7a865204d Bumping version to 0.0.54 2024-10-27 12:27:34 -07:00
3138379e7d web: add tag when viewing news posts 2024-10-27 12:27:16 -07:00
7828fa0ac8 server: add slurper config for rustacean station 2024-10-27 12:15:43 -07:00
b770bb8986 server: add slurp config for grafana 2024-10-27 12:14:15 -07:00
07c0150d3e Bumping version to 0.0.53 2024-10-27 12:03:25 -07:00
f678338822 server: lint, including bug fix 2024-10-27 12:03:16 -07:00
6e15e69254 server: handle forwarded rfc822 messages 2024-10-27 12:02:00 -07:00
2671a3b787 Bumping version to 0.0.52 2024-10-27 10:56:11 -07:00
93073c9602 server: fix pagination counts for tantivy results 2024-10-27 10:55:49 -07:00
88f8a9d537 Bumping version to 0.0.51 2024-10-13 17:40:35 -07:00
b75b298a9d web: match email header styling when viewing post 2024-10-13 17:40:20 -07:00
031b8ce80e Bumping version to 0.0.50 2024-10-03 09:21:48 -07:00
b0ceba3bcf web: consistent html between open/close header, move padding into header code 2024-10-03 09:21:12 -07:00
e5f5b8ff3c Bumping version to 0.0.49 2024-10-03 09:04:03 -07:00
afb1d291ec web: fix right justify of read icon/timestamp on closed message header 2024-10-03 09:03:22 -07:00
55b46ff929 Bumping version to 0.0.48 2024-10-01 17:20:01 -07:00
58acd8018a web: more dense email headers 2024-10-01 17:19:52 -07:00
e0d0ede2ce Bumping version to 0.0.47 2024-10-01 15:12:20 -07:00
ac46b0e4d0 web: change up spacing in email headers. Increase density 2024-10-01 15:12:02 -07:00
e12ea2d7e4 Bumping version to 0.0.46 2024-09-29 19:17:07 -07:00
5f052facdf web: fix styling of envelope on closed headers 2024-09-29 19:16:51 -07:00
4476749203 Bumping version to 0.0.45 2024-09-29 19:05:59 -07:00
0fa860bc71 web: show email address when no name present 2024-09-29 19:05:46 -07:00
b858b23584 Bumping version to 0.0.44 2024-09-29 18:03:05 -07:00
6500e60c40 web: remove dead code 2024-09-29 18:02:45 -07:00
efc991923d Bumping version to 0.0.43 2024-09-29 17:56:39 -07:00
0b5e057fe6 web: fix spacing when there are few To/CC 2024-09-29 17:56:25 -07:00
822e1b0a9c Bumping version to 0.0.42 2024-09-29 17:15:57 -07:00
4f21814be0 web: successfully rewrite some bits in tailwind 2024-09-29 17:15:28 -07:00
17da489229 web: WIP tailwind integration 2024-09-29 16:43:29 -07:00
5b8639b80f Bumping version to 0.0.41 2024-09-29 16:41:36 -07:00
6c9ef912e6 server: don't touch tantivy if no uids reindexed 2024-09-29 16:41:13 -07:00
da636ca1f3 Bumping version to 0.0.40 2024-09-29 16:28:37 -07:00
7880eddccd Bumping version to 0.0.39 2024-09-29 16:28:25 -07:00
3ec1741f10 web & server: using tantivy for news post search 2024-09-29 16:28:05 -07:00
f36d1e0c29 server: continue if db path missing on create_news_db 2024-09-28 12:29:12 -07:00
ebf32a9905 server: WIP tantivy integration 2024-09-28 12:29:12 -07:00
005a457348 Bumping version to 0.0.38 2024-09-28 12:28:53 -07:00
a89a279764 notmuch: use faster, but inaccurate message count 2024-09-28 12:28:41 -07:00
fbc426f218 Bumping version to 0.0.37 2024-09-28 12:23:29 -07:00
27b480e118 web: try alternative for clearing screen on build 2024-09-28 12:22:35 -07:00
dee6ff9ba0 Bumping version to 0.0.36 2024-09-28 12:06:12 -07:00
73bdcd5441 server: add pjpeg support for attachments 2024-09-28 12:06:00 -07:00
64a38e024d Bumping version to 0.0.35 2024-09-28 11:18:39 -07:00
441b40532f Bumping version to 0.0.34 2024-09-28 11:18:37 -07:00
bfb6a6226d Bumping version to 0.0.33 2024-09-28 11:18:37 -07:00
f464585fad web: tweak hr styling 2024-09-28 11:18:37 -07:00
3fe61f8b09 web: clear screen on rebuild 2024-09-28 11:18:37 -07:00
43b3625656 server: join slurped parts with <hr> elements 2024-09-28 11:16:10 -07:00
6505c90f32 Bumping version to 0.0.32 2024-09-26 16:28:02 -07:00
104eb189fe web: shrink <hr> margins 2024-09-26 16:27:50 -07:00
b70e0018d7 Bumping version to 0.0.31 2024-09-25 19:46:15 -07:00
d962d515f5 web: shorten outbound link on news post 2024-09-25 19:45:52 -07:00
3c8d7d4f81 server: move tantivy code to separate mod 2024-09-22 10:26:45 -07:00
d1604f8e70 server: remove done TODO 2024-09-21 18:48:25 -07:00
6f07817c0e Bumping version to 0.0.30 2024-09-21 13:01:27 -07:00
0ac959ab76 server: add slurp config for ingowald 2024-09-21 13:01:17 -07:00
62b17bd6a6 Bumping version to 0.0.29 2024-09-20 08:56:58 -07:00
c0bac99d5a server: add slurp config for zsa blog 2024-09-20 08:56:45 -07:00
3b69c5e74b Bumping version to 0.0.28 2024-09-19 17:06:03 -07:00
539fd469cc server: create index when missing 2024-09-19 17:05:47 -07:00
442688c35c web: lint 2024-09-19 16:54:18 -07:00
da27f02237 Bumping version to 0.0.27 2024-09-19 16:52:35 -07:00
9460e354b7 server: cargo sqlx prepare 2024-09-19 16:52:26 -07:00
6bab128ed9 Bumping version to 0.0.26 2024-09-19 16:33:50 -07:00
3856b4ca5a server: try different cacher url 2024-09-19 16:33:40 -07:00
bef39eefa5 Bumping version to 0.0.25 2024-09-19 16:08:20 -07:00
b0366c7b4d server: try non-https to see if that works 2024-09-19 16:07:59 -07:00
ca02d84d63 Bumping version to 0.0.24 2024-09-19 16:01:55 -07:00
461d5de886 server: change internal git url 2024-09-19 16:01:41 -07:00
f8134dad7a Bumping version to 0.0.23 2024-09-19 15:53:56 -07:00
30f510bb03 server: WIP tantivy, cache slurps, use shared::compute_color 2024-09-19 15:53:09 -07:00
e7cbf9cc45 shared: remove debug logging 2024-09-19 13:54:47 -07:00
5108213af5 web: use shared compute_color 2024-09-19 13:49:24 -07:00
d148f625ac shared: add compute_color 2024-09-19 13:48:56 -07:00
a9b8f5a88f Bumping version to 0.0.22 2024-09-16 20:00:16 -07:00
539b584d9b web: fix broken build 2024-09-16 20:00:06 -07:00
2f8d83fc4b Bumping version to 0.0.21 2024-09-16 19:52:28 -07:00
86ee1257fa web: better progress bar 2024-09-16 19:52:20 -07:00
03f1035e0e Bumping version to 0.0.20 2024-09-12 22:38:18 -07:00
bd578191a8 web: add scroll to top button and squelch some debug logging 2024-09-12 22:37:58 -07:00
d4fc2e2ef1 Bumping version to 0.0.19 2024-09-12 15:41:01 -07:00
cde30de81c web: explicitly set progress to zero when not in thread/news view 2024-09-12 15:40:42 -07:00
96be74e3ee Bumping version to 0.0.18 2024-09-12 15:32:30 -07:00
b78d34b27e web: disable bulma styling for .number 2024-09-12 15:32:18 -07:00
b4b64c33a6 Bumping version to 0.0.17 2024-09-12 10:07:00 -07:00
47b1875022 server: tweak cloudflare and prusa slurp config 2024-09-12 10:06:46 -07:00
b06cbd1381 Bumping version to 0.0.16 2024-09-12 10:03:26 -07:00
9e35f8ca6c web: fix <em> looking like a button 2024-09-12 10:01:58 -07:00
8eaefde67d Bumping version to 0.0.15 2024-09-12 09:28:14 -07:00
d5a3324837 server: slurp config for prusa blog and squelch some info logging 2024-09-12 09:27:57 -07:00
f5c90d8770 Bumping version to 0.0.14 2024-09-11 11:46:04 -07:00
825a125a62 web: redox specific styling 2024-09-11 11:45:53 -07:00
da7cf37dae Bumping version to 0.0.13 2024-09-11 11:41:27 -07:00
1985ae1f49 server: add slurp configs for facebook and redox 2024-09-11 11:41:09 -07:00
91eb3019f9 Bumping version to 0.0.12 2024-09-09 20:31:07 -07:00
66e8e00a9b web: remove dead code 2024-09-09 20:21:51 -07:00
4b8923d852 web: more accurate reading progress bar 2024-09-09 20:21:13 -07:00
baba720749 Bumping version to 0.0.11 2024-09-02 13:36:18 -07:00
1ec22599cc web: make pre blocks look like code blocks in email 2024-09-02 13:35:58 -07:00
c69017bc36 Bumping version to 0.0.10 2024-09-02 13:19:11 -07:00
48bf57fbbe web: more pleasant color scheme for code blocks in email 2024-09-02 13:18:49 -07:00
3491856784 Bumping version to 0.0.9 2024-09-01 16:17:35 -07:00
f887c15b46 web: address lint 2024-09-01 16:17:27 -07:00
7786f850d1 Bumping version to 0.0.8 2024-09-01 16:16:09 -07:00
cad778734e web: rename Msg::Reload->Refresh and create proper Reload 2024-09-01 16:15:38 -07:00
1210f7038a Bumping version to 0.0.7 2024-09-01 16:09:14 -07:00
f9ab7284a3 web: remove obsolete Makefile 2024-09-01 16:09:04 -07:00
100865c923 server: use same html cleanup idiom in nm as we do in newsreader 2024-09-01 16:08:25 -07:00
b8c1710a83 dev: watch for git commits and rebuild on change 2024-09-01 16:07:22 -07:00
215b8cd41d shared: ignore dirty, if git is present we're developing
When developing, dirty can get out of sync between client and server if
you're only doing development in one.
2024-09-01 15:57:02 -07:00
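A hedged reading of this commit, together with the version-string commits just below it ("clean up version string and reload on mismatch"): compare client and server build versions while ignoring a dirty marker, so a dirty tree in only one crate doesn't force endless client reloads. The suffix and function below are illustrative assumptions:

```rust
/// Sketch only: the real check lives in the shared crate, and the exact
/// version format ("0.2.0-a8c5a16", possibly with "-dirty") is assumed
/// from nearby commit messages.
fn canonical(v: &str) -> &str {
    v.trim_end_matches("-dirty")
}

fn versions_match(client: &str, server: &str) -> bool {
    canonical(client) == canonical(server)
}

fn main() {
    // A dirty tree on one side no longer triggers a reload...
    assert!(versions_match("0.2.0-a8c5a16-dirty", "0.2.0-a8c5a16"));
    // ...but a genuinely different build still does.
    assert!(!versions_match("0.2.0-a8c5a16", "0.2.0-1f393f1"));
}
```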
487d7084c3 Bumping version to 0.0.6 2024-09-01 15:48:41 -07:00
b1e761b26f web: don't show progress bar until 400px have scrolled 2024-09-01 15:48:11 -07:00
3efe90ca21 Update release makefile 2024-09-01 15:40:19 -07:00
61649e1e04 Bumping version to 0.0.5 2024-09-01 15:38:39 -07:00
13ac352a10 Helpers to bump version number 2024-09-01 15:37:00 -07:00
5ca7a25e8d Bumping version to 0.0.4 2024-09-01 15:36:48 -07:00
7bb8ef0938 Bumping version to :?} 2024-09-01 15:36:36 -07:00
5c55a290ac Bumping version to :?} 2024-09-01 15:34:53 -07:00
4e3e1b075d Setting crate version to 0.2.0-a8c5a16 2024-09-01 15:30:37 -07:00
a8c5a164ff web: clean up version string and reload on mismatch 2024-09-01 15:02:34 -07:00
1f393f1c7f Add server and client build versions 2024-09-01 14:55:51 -07:00
fdaff70231 server: improve cloudflare and grafana image and iframe rendering 2024-09-01 11:05:07 -07:00
7218c13b9e server: address lint 2024-08-31 16:18:47 -07:00
934cb9d91b web: address lint 2024-08-31 16:11:49 -07:00
4faef5e017 web: add scrollbar for read progress 2024-08-31 16:08:06 -07:00
5c813e7350 web: style improvements for figure captions 2024-08-31 15:04:19 -07:00
fb754469ce web: let pullquotes on grafana blog be full width 2024-08-31 14:46:38 -07:00
548b5a0ab0 server: extract image title and alt attributes into figure captions 2024-08-31 14:43:04 -07:00
f77d0776c4 web: style tweaks for <em> 2024-08-31 14:42:19 -07:00
e73f70af8f Fix new post read/unread handling 2024-08-31 13:49:03 -07:00
a9e6120f81 web: don't make slashdot pull quotes italic 2024-08-31 13:36:21 -07:00
090a010a63 server: fix thread id for news posts 2024-08-31 13:23:25 -07:00
85c762a297 web: add class for mail vs news-post bodies 2024-08-31 11:54:19 -07:00
a8d5617cf2 Treat email and news posts as distinct types on the frontend and backend 2024-08-31 11:40:06 -07:00
760cec01a8 Refactor thread responses into an enum.
Lays groundwork for different types of views, i.e. email, news, docs, etc.
2024-08-26 21:48:53 -07:00
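A rough sketch of the shape this refactor describes, assuming serde for serialization; the variant and field names are hypothetical, not the repo's actual GraphQL types:

```rust
use serde::Serialize;

// Hypothetical payloads; the real fields live in the server's types.
#[derive(Serialize)]
struct EmailThread {
    subject: String,
    message_ids: Vec<String>,
}

#[derive(Serialize)]
struct NewsPost {
    title: String,
    link: String,
    body_html: String,
}

/// One variant per kind of view, so the frontend matches on the tag
/// instead of sniffing fields; docs etc. become new variants later.
#[derive(Serialize)]
#[serde(tag = "kind")]
enum ThreadResponse {
    Email(EmailThread),
    News(NewsPost),
}

fn main() {
    let r = ThreadResponse::News(NewsPost {
        title: "Example".into(),
        link: "https://example.com/post".into(),
        body_html: "<p>hello</p>".into(),
    });
    println!("{}", serde_json::to_string(&r).unwrap());
}
```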
446fcfe37f server: fix url for graphiql 2024-08-26 21:48:25 -07:00
71de3ef8ae server: add ability to slurp contents from site 2024-08-25 19:37:53 -07:00
d98d429b5c notmuch: add TODO 2024-08-25 19:37:37 -07:00
cf5a6fadfd server: sort dependencies 2024-08-24 09:26:52 -07:00
9a078cd238 server: only add "view on site" link if it's not in the html body 2024-08-19 10:57:09 -07:00
a81a803cca server: include default chrome CSS as a baseline for news threads 2024-08-19 10:47:38 -07:00
816587b688 server: fix download of chrome default CSS 2024-08-19 10:47:14 -07:00
4083c58bbd server: add chrome default styles
From:
https://source.chromium.org/chromium/chromium/src/+/main:third_party/blink/renderer/core/html/resources/html.css
2024-08-19 10:31:59 -07:00
8769e5acd4 server: fix counting issue w/ notmuch (messages vs threads) 2024-08-18 14:18:15 -07:00
3edf9fdb5d web: fix age display when less than 1 minute 2024-08-18 12:55:39 -07:00
ac0ce29c76 web: preserve checked boxes on search refresh 2024-08-18 11:04:31 -07:00
5279578c64 server: fix inline image loading 2024-08-17 16:33:53 -07:00
632f64261e server: fix notmuch paging bug 2024-08-15 16:21:27 -07:00
b5e25eef78 server: fix paging if only notmuch results are found 2024-08-15 14:58:23 -07:00
8a237bf8e1 server: add link to news posts back to original article 2024-08-12 21:14:32 -07:00
c5def6c0e3 web: allow clicking anywhere in the subject line in search results 2024-08-12 20:54:16 -07:00
d1cfc77148 server: more news title/body cleanup, and don't search news so much 2024-08-12 20:53:48 -07:00
c314e3c798 web: make whole row of search results clickable
No longer allow searching by tag by clicking on chiclet
2024-08-06 21:37:38 -07:00
7c5ef96ff0 server: fix paging bug where p1->p2->p1 wouldn't show consistent results 2024-08-06 21:15:10 -07:00
474cf38180 server: cargo sqlx prepare 2024-08-06 20:55:05 -07:00
e81a452dfb web: scroll to top when viewing a new tag 2024-08-06 20:54:25 -07:00
e570202ba2 Merge news and email search results 2024-08-06 20:44:25 -07:00
a84c9f0eaf server: address some lint 2024-08-05 15:54:26 -07:00
530bd8e350 Inline mvp and custom override CSS when rendering RSS posts 2024-08-05 15:47:31 -07:00
359e798cfa server: going with mvp.css not normalize.css 2024-08-04 21:23:05 -07:00
d7d257a6b5 https://andybrewer.github.io/mvp/mvp.css 2024-08-04 21:22:34 -07:00
9ad9ff6879 https://necolas.github.io/normalize.css/8.0.1/normalize.css 2024-08-03 21:31:09 -07:00
56bc1cf7ed server: escape RSS feeds that are HTML escaped 2024-08-03 11:29:20 -07:00
e0863ac085 web: more robust avatar initial filtering 2024-07-29 17:29:15 -07:00
d5fa89b38c web: show tag list in all modalities. WIP 2024-07-29 08:48:44 -07:00
605af13a37 web: monospace font for plain text emails 2024-07-29 08:32:28 -07:00
3838cbd6e2 cargo fix 2024-07-24 11:08:47 -07:00
c76df0ef90 web: update copy icon in more places 2024-07-24 11:06:38 -07:00
cd77d302df web: small icon tweak for copying email addresses 2024-07-24 11:03:32 -07:00
71348d562d version bump 2024-07-24 11:03:26 -07:00
b6ae46db93 Move cargo config up a directory 2024-07-22 16:56:13 -07:00
6cb84054ed Only build server by default 2024-07-22 16:48:47 -07:00
7b511c1673 Fix cleanhtml build 2024-07-22 16:41:14 -07:00
bfd5e12bea Make URL joining more robust 2024-07-22 16:39:59 -07:00
ad8fb77857 Add copy to clipboard links to from/to/cc addresses 2024-07-22 16:04:25 -07:00
831466ddda Add mark read/unread support for news 2024-07-22 14:43:05 -07:00
4ee34444ae Move thread: and id: prefixing to server side.
This paves way for better news: support
2024-07-22 14:26:48 -07:00
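A hedged sketch of what moving the prefixing server-side might look like; only "thread:" and "id:" are named by the commit itself, and the "news:" prefix is extrapolated from "better news: support":

```rust
// Hypothetical result kinds; not the repo's actual types.
enum SearchHit {
    Thread(String),
    Message(String),
    News(String),
}

/// The server stamps the prefix once, so the web client can treat every
/// hit as an opaque, already-qualified id.
fn qualified_id(hit: &SearchHit) -> String {
    match hit {
        SearchHit::Thread(id) => format!("thread:{id}"),
        SearchHit::Message(id) => format!("id:{id}"),
        SearchHit::News(id) => format!("news:{id}"),
    }
}

fn main() {
    assert_eq!(
        qualified_id(&SearchHit::Thread("abc123".into())),
        "thread:abc123"
    );
}
```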
879ddb112e Remove some logging and fix a comment 2024-07-22 14:26:24 -07:00
331fb4f11b Fix build 2024-07-22 12:19:45 -07:00
4e5275ca0e cargo sqlx prepare 2024-07-22 12:19:38 -07:00
1106377550 Normalize links and images based on post's URL 2024-07-22 11:27:15 -07:00
b5468bced2 Implement pagination for newsreader 2024-07-22 09:28:12 -07:00
01cbe6c037 web: set reasonable defaults on front page requests 2024-07-22 08:28:12 -07:00
d0a02c2f61 cargo fix lint 2024-07-22 08:19:07 -07:00
c499672dde Rollback attempt to make unread tag queries faster for newsreader 2024-07-22 08:17:46 -07:00
3aa0b94db4 Fix bug in pagination when more than SEARCH_RESULTS_PER_PAGE returned 2024-07-22 08:13:45 -07:00
cdb64ed952 Remove old search URLs 2024-07-22 07:25:15 -07:00
834efc5c94 Handle needs_unread on tag query. Move News to top of tag list 2024-07-22 07:24:28 -07:00
79db94f67f Add pretty site names to search and thread views 2024-07-21 20:50:50 -07:00
ec41f840d5 Store remaining text when parsing query 2024-07-21 15:19:19 -07:00
d9d57c66f8 Sort by title on date tie breaker 2024-07-21 15:18:31 -07:00
9746c9912b Implement newsreader counting 2024-07-21 15:13:09 -07:00
abaaddae3a Implement unread filtering on threads 2024-07-21 15:12:32 -07:00
0bf64004ff server: order tags alphabetically 2024-07-21 13:09:08 -07:00
6fae9cd018 WIP basic news thread rendering 2024-07-21 12:50:21 -07:00
65fcbd4b77 WIP move thread loading for notmuch into nm mod 2024-07-21 09:31:37 -07:00
dd09bc3168 WIP add search 2024-07-21 09:05:03 -07:00
0bf865fdef WIP reading news from app 2024-07-21 07:53:02 -07:00
5c0c45b99f Revert "Make blockquotes fancier"
This reverts commit 221f046664.
2024-07-13 15:21:59 -07:00
221f046664 Make blockquotes fancier 2024-07-13 09:19:52 -07:00
2a9d5b393e Use default styling for lists. 2024-07-13 09:02:35 -07:00
90860e5511 Remove profile from workspace config 2024-07-13 09:02:19 -07:00
0b1f806276 web: visualize blockquote better 2024-07-12 07:44:31 -07:00
0482713241 address cargo udeps 2024-07-07 15:06:04 -07:00
bb3e18519f cargo update 2024-07-07 14:59:10 -07:00
3a4d08facc web: lint 2024-07-07 14:43:58 -07:00
30064d5904 server: fix broken open-link-in-new-tab from recent changes 2024-07-07 14:40:37 -07:00
c288b7fd67 Disable running test 2024-07-06 18:47:55 -07:00
b4d1528612 web: migrate from lib->bin 2024-07-06 18:18:28 -07:00
5fc272054c Put all URLs under /api/ 2024-07-05 20:00:52 -07:00
714e73aeb1 Address a bunch of lint 2024-07-05 10:44:37 -07:00
3dfd2d48b3 Fix compile error 2024-07-05 10:40:14 -07:00
3a5a9bd66a Add support for inline images 2024-07-05 10:38:12 -07:00
55d7aec516 server: handle multipart/related with a multipart/alternative embedded 2024-05-05 19:03:38 -07:00
96d3e4a7d6 Version bump 2024-05-02 09:30:11 -07:00
beb96aba14 web: fix inverted boolean on spam shortcut 2024-04-29 21:04:56 -07:00
48f66c7096 web: when marking spam, also mark it as read 2024-04-14 08:17:36 -07:00
a96b553b08 Version bumps to get fixes to mailparse & data-encoding 2024-04-14 07:55:26 -07:00
31a3ac66b6 web: swap spam and read/unread buttons 2024-04-08 20:51:56 -07:00
a33e1f5d3c Update lock 2024-04-06 16:22:30 -07:00
423ea10d34 web: use upstream human_format 2024-04-06 16:20:15 -07:00
1b221d5c16 web&server: show raw body contents of UnhandledContentType 2024-04-06 10:21:31 -07:00
d4038f40d6 web: add UI to remove tags when viewing messages 2024-04-06 09:38:00 -07:00
dc7b3dd3e8 web: human format attachment size 2024-04-06 08:52:20 -07:00
1f5f10f78d server: properly filter inline vs attachments 2024-04-06 08:34:26 -07:00
7df11639ed web: don't show text on action icons on tablet/mobile 2024-04-06 08:10:04 -07:00
b0305b7411 web: separate spam button from read buttons and color red. 2024-04-06 08:00:35 -07:00
8abf9398e9 web: add mark as spam buttons 2024-04-03 21:10:23 -07:00
1b196a2703 server: add ability to add/remove labels 2024-04-03 21:07:06 -07:00
a24f456136 web: don't show mime type on attachment 2024-04-03 20:28:51 -07:00
d8fef54606 web: add attachment icons 2024-04-03 20:25:42 -07:00
9a5dc20f83 server: add functioning download attachment handler 2024-03-26 08:25:52 -07:00
ff1c3f5791 server: preserve class attribute on sanitized html 2024-03-26 08:25:37 -07:00
c74cd66826 server: add ability to view inline image attachments 2024-03-24 18:11:15 -07:00
c30cfec09d web: cleanup lint 2024-03-05 09:24:41 -08:00
e20e794508 web: remove mostly useless footer 2024-03-05 09:23:59 -08:00
d09efd3a69 web: overflow:auto the body so wide messages behave better 2024-03-05 09:20:31 -08:00
1ac7f5b6dc web: handle empty subjects 2024-03-05 09:04:19 -08:00
fc7a4a747c web: debug search for tag:letterbox instead of is:unread 2024-02-28 19:13:37 -08:00
facea2326e web: make from and date area clickable on search results page 2024-02-27 09:46:23 -08:00
56311bbe05 web: css cleanup for search results table 2024-02-27 09:07:49 -08:00
994631e872 web: display To/CC differently on expansion 2024-02-26 11:24:09 -08:00
43471d162f web: make empty subject line clickable 2024-02-26 11:01:20 -08:00
b997a61da8 web: better wrapping behavior for plain text messages 2024-02-24 09:14:50 -08:00
f69dd0b198 server: debug print unhandled mimetypes for some multipart messages 2024-02-23 16:55:13 -08:00
523584fbbc web: change style for attachments 2024-02-23 16:54:53 -08:00
4139ec38d8 web: add TODO about message and thread id types 2024-02-23 16:10:17 -08:00
5379ae09dc server: replace string literals in a bunch of places with consts 2024-02-23 16:09:58 -08:00
ebb16aef9e web: make mark read/unread icon target much larger 2024-02-23 07:07:20 -08:00
fc87fd702c web: refactor header rendering code, add more detail when message open 2024-02-22 21:19:09 -08:00
42484043a1 web: have colored initials for From
Add scaffolding for profile pics
2024-02-22 20:37:21 -08:00
3f268415e9 web: rework header in thread view, tweak some styles, remove some logging 2024-02-22 18:54:34 -08:00
c2a5fe19e3 web: go back to search page after changing read status 2024-02-21 17:58:12 -08:00
42ce88d931 web: add select all/partial/none for search table 2024-02-21 15:02:58 -08:00
cda99fc7a5 web: improve checkbox style on desktop 2024-02-20 20:20:50 -08:00
b33a252698 web: label read/unread icons 2024-02-20 20:16:25 -08:00
9e3ae22827 web: lint 2024-02-20 19:59:35 -08:00
5923547159 web: handle expand/collapse of messages separate from unread status 2024-02-20 19:58:50 -08:00
fe980c5468 web: lint 2024-02-20 19:25:28 -08:00
f50fe7196e web: add bulk read/unread functionality 2024-02-20 19:24:56 -08:00
de3f392bd7 web: use bold text to indicate unread messages 2024-02-20 14:29:42 -08:00
02c0d36f90 web: remove a ton of legacy deprecated code 2024-02-20 14:13:06 -08:00
04592ddcc4 web: change up unread message styles 2024-02-20 13:55:54 -08:00
c8e0f68278 web: remove info statement 2024-02-16 19:24:16 -08:00
4957b485a0 web: add mark read button on search result page 2024-02-16 19:23:35 -08:00
7ebe517a34 web: tweak subject line style 2024-02-11 20:48:26 -08:00
516eedb086 web: add per-message unread control and display 2024-02-11 20:29:49 -08:00
ce836cd1e8 notmuch: add tag manipulation 2024-02-11 19:59:20 -08:00
f7010fa278 cargo update 2024-02-11 19:54:35 -08:00
5451dd2056 server: add mutation to mark messages as read 2024-02-11 19:43:34 -08:00
81ed3a8ca2 Linkify URLs missing schema 2024-02-07 19:41:34 -08:00
0f1a60a348 Sanitize html when linkifying plain text. 2024-02-03 11:15:57 -08:00
c59a883351 Address lint. 2024-02-03 11:14:43 -08:00
568d83f029 linkify URLs in plaintext emails. 2024-02-03 11:10:51 -08:00
569781b592 Tweak CSS for viewing body of messages 2024-01-20 08:34:25 -08:00
1b00c9e944 Updated cargo lock 2024-01-20 08:14:59 -08:00
901785e47c Change footer class to prevent conflict with email bodies. 2024-01-20 08:14:37 -08:00
8c47f01758 Improve server side html sanitization. 2024-01-20 08:14:10 -08:00
304819275d Open links in a new tab. 2024-01-19 21:07:24 -08:00
b1ea44963d Lint and cleanup empty file. 2024-01-17 12:31:56 -08:00
181965968c state: auto reload every 30 seconds 2024-01-17 12:31:37 -08:00
5b3eadb7bd Run tests before rebuilding app 2024-01-06 08:53:06 -08:00
28d484117b Change makefile to use variable for app name.
Make this more copypastable.
2024-01-06 08:52:42 -08:00
a0b0689e01 Fix wrapping/sizing of message bodies with long unbreakable text. 2024-01-06 08:52:19 -08:00
33ec63f097 web: update seed_hooks to my copy so I can pin to seed=0.10.0 2023-12-10 19:42:07 -08:00
7b22f85429 web: show union of tags when viewing thread 2023-12-10 17:26:24 -08:00
fa7df55b0e server: send tags on each message in thread 2023-12-10 17:26:04 -08:00
d2cf270dda web: properly truncate long headers on message view 2023-12-10 16:35:51 -08:00
f1b5e78962 web: make debug output hidden by default 2023-12-10 16:11:15 -08:00
fae4e43682 web: show thread count when greater than 1 2023-12-10 15:50:28 -08:00
37eb3d1dfd web: wrap content tree debug so messages aren't super wide 2023-12-07 10:24:39 -08:00
e0890f1181 web: search for unread tags when clicking under Unread section 2023-12-05 20:55:41 -08:00
c31f9d581f web: upgrade to seed-0.10.0 2023-12-05 20:46:59 -08:00
f2347345b4 Version bumps made css_inline uncompilable for wasm 2023-12-05 14:12:15 -08:00
e34f2a1f39 notmuch: fix tests 2023-12-05 12:50:52 -08:00
7a6000be26 server: address lint 2023-12-05 11:26:23 -08:00
dd1a8c2eae procmail2notmuch: WIP update script 2023-12-05 11:23:04 -08:00
42590b3cbc cargo update 2023-12-05 11:04:31 -08:00
94f7ad109a Merge commit 'f6bdf30' 2023-12-05 09:56:55 -08:00
f6bdf302fe server & notmuch: more attachment WIP, stop leaking notmuch processes 2023-12-03 14:01:18 -08:00
b76c535738 web: use log::error, not seed::error 2023-12-03 09:11:31 -08:00
29949c703d web: archive live site before pushing new one 2023-12-03 09:11:15 -08:00
f5f9eb175d server: WIP attachment serving 2023-12-03 09:11:00 -08:00
488c3b86f8 web: truncate raw messages and prep for attachments 2023-12-03 09:03:36 -08:00
be8fd59703 web: rename view_thread to take advantage of new namespaces 2023-12-03 08:49:20 -08:00
071fe2e206 web: show message-ID when viewing thread 2023-12-02 16:35:37 -08:00
ac5660a6d0 web: have trunk proxy /original/ requests to backend 2023-12-02 16:35:18 -08:00
99a104517d notmuch: comment typo 2023-12-02 16:35:05 -08:00
c3692cadec server: add id and header to ShowThreadQuery API 2023-12-02 16:34:44 -08:00
b14000952c server: make unread message counting much faster, remove rayon dep 2023-12-02 15:41:22 -08:00
7a32d5c630 server: include headers in debug output 2023-12-02 15:12:40 -08:00
714b057fdb web: add tablet rendering, listen to window resize events. 2023-12-02 10:56:14 -08:00
4c2526c70b web: remove unnecessary view_mobile_ prefix 2023-12-02 10:13:08 -08:00
a8f4aa03bd web: rename legacy functions to take advantage of mod namespacing 2023-12-02 10:11:56 -08:00
28d5562491 web: move legacy (pre-graphql) rendering to separate mod 2023-12-02 10:07:47 -08:00
e6f20e538a web: move mobile specific code to separate mod 2023-12-02 10:02:12 -08:00
970bb55c73 web: move desktop specific code into separate mod 2023-12-02 09:56:57 -08:00
12f0491455 web: remove stale comments 2023-12-02 09:34:57 -08:00
ef8362d6f2 web: remove some unused code 2023-12-02 09:34:26 -08:00
0a7cdefda3 web: refactor code into separate modules 2023-12-02 09:29:50 -08:00
cfe1446668 web: force tag list to be open when no unread messages 2023-11-29 09:27:53 -08:00
7c38962d21 web: make tag list hidable 2023-11-28 20:03:17 -08:00
7102f26c9e web: conditionally show unread section 2023-11-28 07:32:22 -08:00
71a3315fe8 web: lint and clean up search input handling 2023-11-27 21:11:12 -08:00
7cac81cddb web: update implement_email macro to handle repetition 2023-11-27 20:33:47 -08:00
3a5ca74d71 web: change tag list styling and show unread at the top 2023-11-27 19:48:19 -08:00
71af8179ec web: hierarchical tags list on desktop 2023-11-27 19:16:28 -08:00
d66a7d3d53 web: use singular version of view_address for From 2023-11-27 17:20:11 -08:00
e0fbb0253e web: create implement_email! macro 2023-11-27 17:16:57 -08:00
48466808d3 web & server: plumb debugging info for content type hierarchy.
Also cleanup Email trait.
2023-11-27 13:47:02 -08:00
87dfe4ace7 server: cleanup lint. 2023-11-26 21:31:06 -08:00
d45f223d52 server: fix pagination with small counts and no first/last set 2023-11-26 21:27:57 -08:00
e8c58bdbd0 server: handle multipart/mixed with an html or text subpart 2023-11-26 21:09:56 -08:00
87d687cde5 server: sanitize html using ammonia 2023-11-26 21:00:44 -08:00
c8147ded60 web & server: add handling for google calendar and wellsfargo emails. 2023-11-26 20:51:53 -08:00
1261bdf8a9 web & server: improved debug printing of unhandled mime types 2023-11-26 18:50:32 -08:00
11366b6fac web & server: implement handling for text and html bodies. 2023-11-26 16:37:29 -08:00
1cdabc348b web: better date formatting 2023-11-26 16:01:22 -08:00
02e16b4547 web: more compact output on desktop and mobile 2023-11-26 15:46:03 -08:00
d5a001bf03 web: refresh tags on thread view in addition to search results. 2023-11-26 15:31:51 -08:00
0ae72b63d0 web: add basic graphql view thread, no body support. 2023-11-26 15:27:19 -08:00
447a4a3387 server: basic graphql thread show, no body support yet. 2023-11-26 13:13:04 -08:00
0737f5aac5 web: rewrite frontend to use graphql for search results 2023-11-25 09:06:24 -08:00
3e3024dd5c server: handle search with no first/last better 2023-11-25 09:05:53 -08:00
24414b04bb server: fix backward pagination 2023-11-25 08:39:56 -08:00
f7df834325 notmuch: default empty search to wildcard 2023-11-25 08:39:30 -08:00
bce2c741c4 web: add non-functional graphql. 2023-11-21 14:06:48 -08:00
1b44bc57bb web: Initial commit of graphql schema and helper to update it. 2023-11-21 13:36:11 -08:00
ff6675b08f server: add unread field to tag query.
Optionally fill out unread, as it's expensive.
2023-11-21 13:17:11 -08:00
64912be4eb Hide quoted emails 2023-11-21 12:37:58 -08:00
57ccef18cb Make clicking search results on mobile easier. 2023-11-21 12:27:58 -08:00
2a24a20529 Revert stub show_pretty that will be obsoleted by graphql. 2023-11-21 08:35:35 -08:00
e6692059b4 Fix search pagination and add count RPC. 2023-11-20 21:18:40 -08:00
a7b172099b Add graphql search with pagination. 2023-11-20 20:56:16 -08:00
f52a76dba3 Added graphql endpoint and tested with tags implementation. 2023-11-20 18:38:10 -08:00
43e4334890 Set default page size on server to match client side page size. 2023-11-20 17:57:07 -08:00
1d00bdb757 Squelch logging and remove unused variable. 2023-11-20 17:54:50 -08:00
6901c9fde9 Format today and yesterday better. 2023-11-20 17:53:49 -08:00
6251c54873 Show time of email when older than 1 week 2023-11-20 17:47:06 -08:00
f6c1835b18 Custom formatting of the age string, widen subject column. 2023-11-20 17:41:58 -08:00
95976c2860 Mobile style tweaks. 2023-11-20 15:49:30 -08:00
01589d7136 Add favicon 2023-11-20 15:40:07 -08:00
a2664473c8 Improve density on mobile. 2023-11-14 21:33:09 -08:00
da15ef0f15 Move main.rs to bin/ and stub some message stuff. 2023-11-06 18:41:12 -08:00
035508f3ad Better use of space on search table for desktop. 2023-11-05 08:36:14 -08:00
69558f15b4 Properly perform the right data request on fresh page load. 2023-11-05 08:07:14 -08:00
a8f538eddf Show navbar at bottom of page too. 2023-09-24 13:13:26 -07:00
01e5ea14ab URL decode queries. 2023-09-05 09:49:59 -07:00
042d475c75 Style tweaks 2023-09-02 13:10:44 -07:00
dd0af52feb cargo update 2023-09-02 13:10:34 -07:00
130f9bbeba Use Trunk.toml for trunk config. 2023-09-02 13:10:17 -07:00
0a05b32a7a Remove 'TEST' text when viewing certain email types 2023-09-02 09:00:36 -07:00
c3f897c61a Fix pagination and default homepage to unread search. 2023-08-11 16:51:41 -07:00
c62bac037f Reload page on refresh 2023-08-11 14:06:47 -07:00
79a57f3082 Address workspace lint 2023-08-11 13:55:39 -07:00
c33de9d754 cargo update 2023-07-15 16:58:40 -07:00
105 changed files with 19639 additions and 2613 deletions

9
.cargo/config.toml Normal file

@@ -0,0 +1,9 @@
[build]
rustflags = [ "--cfg=web_sys_unstable_apis" ]
[registry]
global-credential-providers = ["cargo:token"]
[registries.xinu]
index = "sparse+https://git.z.xinu.tv/api/packages/wathiede/cargo/"

10
.envrc Normal file

@@ -0,0 +1,10 @@
source_up
export DATABASE_USER="newsreader";
export DATABASE_NAME="newsreader";
export DATABASE_HOST="nixos-07.h.xinu.tv";
export DATABASE_URL="postgres://${DATABASE_USER}@${DATABASE_HOST}/${DATABASE_NAME}";
export PROD_DATABASE_USER="newsreader";
export PROD_DATABASE_NAME="newsreader";
export PROD_DATABASE_HOST="postgres.h.xinu.tv";
export PROD_DATABASE_URL="postgres://${PROD_DATABASE_USER}@${PROD_DATABASE_HOST}/${PROD_DATABASE_NAME}";
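For context, a minimal sketch of how the server side consumes DATABASE_URL at run time (illustrative only; the crate and table names are taken from elsewhere in this diff):

use sqlx::postgres::PgPool;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // DATABASE_URL is exported by the .envrc above; the sqlx query! macros
    // also read it at compile time to type-check queries.
    let dsn = std::env::var("DATABASE_URL")?;
    let pool = PgPool::connect(&dsn).await?;
    let posts: i64 = sqlx::query_scalar("SELECT COUNT(*) FROM post")
        .fetch_one(&pool)
        .await?;
    println!("{posts} posts");
    Ok(())
}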

67
.gitea/workflows/rust.yml Normal file

@@ -0,0 +1,67 @@
on: [push]
name: Continuous integration
jobs:
check:
name: Check
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions-rust-lang/setup-rust-toolchain@v1
- run: cargo check
test:
name: Test Suite
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions-rust-lang/setup-rust-toolchain@v1
- run: cargo test
trunk:
name: Trunk
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions-rust-lang/setup-rust-toolchain@v1
with:
toolchain: stable
target: wasm32-unknown-unknown
- run: cargo install trunk
- run: cd web; trunk build
fmt:
name: Rustfmt
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions-rust-lang/setup-rust-toolchain@v1
with:
components: rustfmt
- name: Rustfmt Check
uses: actions-rust-lang/rustfmt@v1
build:
name: build
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions-rust-lang/setup-rust-toolchain@v1
- run: cargo build
udeps:
name: Disallow unused dependencies
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions-rust-lang/setup-rust-toolchain@v1
with:
toolchain: nightly
- name: Run cargo-udeps
uses: aig787/cargo-udeps-action@v1
with:
version: 'latest'
args: '--all-targets'

7025
Cargo.lock generated

File diff suppressed because it is too large

@@ -1,12 +1,18 @@
[workspace]
resolver = "2"
members = [
"web",
"server",
"notmuch",
"procmail2notmuch",
"shared"
]
default-members = ["server"]
members = ["web", "server", "notmuch", "procmail2notmuch", "shared"]
[workspace.package]
authors = ["Bill Thiede <git@xinu.tv>"]
edition = "2021"
license = "UNLICENSED"
publish = ["xinu"]
version = "0.15.11"
repository = "https://git.z.xinu.tv/wathiede/letterbox"
[profile.dev]
opt-level = 1
[profile.release]
lto = true

19
Justfile Normal file

@@ -0,0 +1,19 @@
export CARGO_INCREMENTAL := "0"
export RUSTFLAGS := "-D warnings"
default:
@echo "Run: just patch|minor|major"
major: (_release "major")
minor: (_release "minor")
patch: (_release "patch")
sqlx-prepare:
cd server; cargo sqlx prepare && git add .sqlx; git commit -m "cargo sqlx prepare" .sqlx || true
pull:
git pull
_release level: pull sqlx-prepare
cargo-release release -x {{ level }} --workspace --no-confirm --registry=xinu

7
Makefile Normal file

@@ -0,0 +1,7 @@
.PHONY: release
release:
(cd server; cargo sqlx prepare && git add .sqlx; git commit -m "cargo sqlx prepare" .sqlx || true)
bash scripts/update-crate-version.sh
git push
all: release

4
dev.sh

@@ -1,7 +1,7 @@
cd -- "$( dirname -- "${BASH_SOURCE[0]}" )"
tmux new-session -d -s letterbox-dev
tmux rename-window web
tmux send-keys "cd web; trunk serve --release --address 0.0.0.0 --port 6758 --proxy-backend http://localhost:9345/ --proxy-rewrite=/api/ -w ../shared -w ../notmuch -w ./" C-m
tmux send-keys "cd web; trunk serve -w ../.git -w ../shared -w ../notmuch -w ./" C-m
tmux new-window -n server
tmux send-keys "cd server; cargo watch -x run -w ../shared -w ../notmuch -w ./" C-m
tmux send-keys "cd server; cargo watch -c -w ../.git -w ../shared -w ../notmuch -w ./ -x 'run postgres://newsreader@nixos-07.h.xinu.tv/newsreader ../target/database/newsreader /tmp/letterbox/slurp'" C-m
tmux attach -d -t letterbox-dev


@@ -1,17 +1,24 @@
[package]
name = "notmuch"
version = "0.1.0"
edition = "2021"
name = "letterbox-notmuch"
exclude = ["/testdata"]
description = "Wrapper for calling notmuch cli"
authors.workspace = true
edition.workspace = true
license.workspace = true
publish.workspace = true
repository.workspace = true
version.workspace = true
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
log = "0.4.14"
mailparse = "0.16.0"
serde = { version = "1.0", features = ["derive"] }
serde_json = { version = "1.0", features = ["unbounded_depth"] }
thiserror = "1.0.30"
thiserror = "2.0.0"
tracing = "0.1.41"
[dev-dependencies]
itertools = "0.10.1"
itertools = "0.14.0"
pretty_assertions = "1"
rayon = "1.5"


@@ -207,14 +207,16 @@
//! ```
use std::{
collections::HashMap,
ffi::OsStr,
io::{self, BufRead, BufReader, Lines},
io::{self},
path::{Path, PathBuf},
process::{Child, ChildStdout, Command, Stdio},
process::Command,
};
use log::info;
use log::{error, info};
use serde::{Deserialize, Serialize};
use tracing::instrument;
/// # Number of seconds since the Epoch
pub type UnixTime = isize;
@@ -269,6 +271,12 @@ pub struct Headers {
#[serde(skip_serializing_if = "Option::is_none")]
pub bcc: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
#[serde(alias = "Delivered-To")]
pub delivered_to: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
#[serde(alias = "X-Original-To")]
pub x_original_to: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub reply_to: Option<String>,
pub date: String,
}
@@ -454,15 +462,21 @@ pub enum NotmuchError {
SerdeJson(#[from] serde_json::Error),
#[error("failed to parse bytes as str")]
Utf8Error(#[from] std::str::Utf8Error),
#[error("failed to parse bytes as String")]
StringUtf8Error(#[from] std::string::FromUtf8Error),
#[error("failed to parse str as int")]
ParseIntError(#[from] std::num::ParseIntError),
#[error("failed to parse mail: {0}")]
MailParseError(#[from] mailparse::MailParseError),
}
#[derive(Default)]
#[derive(Clone, Default)]
pub struct Notmuch {
config_path: Option<PathBuf>,
}
// TODO: rewrite to use tokio::process::Command and make everything async to see if that helps with
// concurrency being more parallel.
impl Notmuch {
pub fn with_config<P: AsRef<Path>>(config_path: P) -> Notmuch {
Notmuch {
@@ -470,6 +484,7 @@ impl Notmuch {
}
}
#[instrument(skip_all)]
pub fn new(&self) -> Result<Vec<u8>, NotmuchError> {
self.run_notmuch(["new"])
}
@@ -478,38 +493,75 @@ impl Notmuch {
self.run_notmuch(std::iter::empty::<&str>())
}
#[instrument(skip_all, fields(query=query))]
pub fn tags_for_query(&self, query: &str) -> Result<Vec<String>, NotmuchError> {
let res = self.run_notmuch(["search", "--format=json", "--output=tags", query])?;
Ok(serde_json::from_slice(&res)?)
}
pub fn tags(&self) -> Result<Vec<String>, NotmuchError> {
self.tags_for_query("*")
}
#[instrument(skip_all, fields(tag=tag,search_term=search_term))]
pub fn tag_add(&self, tag: &str, search_term: &str) -> Result<(), NotmuchError> {
self.run_notmuch(["tag", &format!("+{tag}"), search_term])?;
Ok(())
}
#[instrument(skip_all, fields(tag=tag,search_term=search_term))]
pub fn tag_remove(&self, tag: &str, search_term: &str) -> Result<(), NotmuchError> {
self.run_notmuch(["tag", &format!("-{tag}"), search_term])?;
Ok(())
}
#[instrument(skip_all, fields(query=query,offset=offset,limit=limit))]
pub fn search(
&self,
query: &str,
offset: usize,
limit: usize,
) -> Result<SearchSummary, NotmuchError> {
let res = self.run_notmuch([
"search",
"--format=json",
&format!("--offset={offset}"),
&format!("--limit={limit}"),
query,
])?;
Ok(serde_json::from_slice(&res)?)
let query = if query.is_empty() { "*" } else { query };
let res = self
.run_notmuch([
"search",
"--format=json",
&format!("--offset={offset}"),
&format!("--limit={limit}"),
query,
])
.inspect_err(|err| error!("failed to notmuch search for query '{query}': {err}"))?;
Ok(serde_json::from_slice(&res).unwrap_or_else(|err| {
error!("failed to decode search result for query '{query}': {err}");
SearchSummary(Vec::new())
}))
}
#[instrument(skip_all, fields(query=query))]
pub fn count(&self, query: &str) -> Result<usize, NotmuchError> {
// NOTE: --output=threads is technically more correct, but really slow
// TODO: find a fast thread count path
// let res = self.run_notmuch(["count", "--output=threads", query])?;
let res = self.run_notmuch(["count", query])?;
// Strip '\n' from res.
let s = std::str::from_utf8(&res[..res.len() - 1])?;
Ok(s.parse()?)
let s = std::str::from_utf8(&res)?.trim();
Ok(s.parse()
.inspect_err(|err| error!("failed to parse count for query '{query}': {err}"))
.unwrap_or(0))
}
#[instrument(skip_all, fields(query=query))]
pub fn show(&self, query: &str) -> Result<ThreadSet, NotmuchError> {
let slice = self.run_notmuch([
"show",
"--include-html=true",
"--entire-thread=true",
"--entire-thread=false",
"--format=json",
query,
])?;
// Notmuch returns JSON with invalid unicode. So we lossy convert it to a string here an
// Notmuch returns JSON with invalid unicode. So we lossy convert it to a string here and
// use that for parsing in rust.
let s = String::from_utf8_lossy(&slice);
let mut deserializer = serde_json::Deserializer::from_str(&s);
@@ -519,6 +571,7 @@ impl Notmuch {
Ok(val)
}
#[instrument(skip_all, fields(query=query,part=part))]
pub fn show_part(&self, query: &str, part: usize) -> Result<Part, NotmuchError> {
let slice = self.run_notmuch([
"show",
@@ -528,7 +581,7 @@ impl Notmuch {
&format!("--part={}", part),
query,
])?;
// Notmuch returns JSON with invalid unicode. So we lossy convert it to a string here an
// Notmuch returns JSON with invalid unicode. So we lossy convert it to a string here and
// use that for parsing in rust.
let s = String::from_utf8_lossy(&slice);
let mut deserializer = serde_json::Deserializer::from_str(&s);
@@ -538,21 +591,102 @@ impl Notmuch {
Ok(val)
}
#[instrument(skip_all, fields(id=id))]
pub fn show_original(&self, id: &MessageId) -> Result<Vec<u8>, NotmuchError> {
self.show_original_part(id, 0)
}
#[instrument(skip_all, fields(id=id,part=part))]
pub fn show_original_part(&self, id: &MessageId, part: usize) -> Result<Vec<u8>, NotmuchError> {
let res = self.run_notmuch(["show", "--part", &part.to_string(), id])?;
Ok(res)
}
pub fn message_ids(&self, query: &str) -> Result<Lines<BufReader<ChildStdout>>, NotmuchError> {
let mut child = self.run_notmuch_pipe(["search", "--output=messages", query])?;
Ok(BufReader::new(child.stdout.take().unwrap()).lines())
#[instrument(skip_all, fields(query=query))]
pub fn message_ids(&self, query: &str) -> Result<Vec<String>, NotmuchError> {
let res = self.run_notmuch(["search", "--output=messages", "--format=json", query])?;
Ok(serde_json::from_slice(&res)?)
}
// TODO(wathiede): implement tags() based on "notmuch search --output=tags '*'"
#[instrument(skip_all, fields(query=query))]
pub fn files(&self, query: &str) -> Result<Vec<String>, NotmuchError> {
let res = self.run_notmuch(["search", "--output=files", "--format=json", query])?;
Ok(serde_json::from_slice(&res)?)
}
#[instrument(skip_all)]
pub fn unread_recipients(&self) -> Result<HashMap<String, usize>, NotmuchError> {
let slice = self.run_notmuch([
"show",
"--include-html=false",
"--entire-thread=false",
"--body=false",
"--format=json",
// Arbitrary limit to prevent too much work
"--limit=1000",
"is:unread",
])?;
// Notmuch returns JSON with invalid unicode. So we lossy convert it to a string here and
// use that for parsing in rust.
let s = String::from_utf8_lossy(&slice);
let mut deserializer = serde_json::Deserializer::from_str(&s);
deserializer.disable_recursion_limit();
let ts: ThreadSet = serde::de::Deserialize::deserialize(&mut deserializer)?;
deserializer.end()?;
let mut r = HashMap::new();
fn collect_from_thread_node(
r: &mut HashMap<String, usize>,
tn: &ThreadNode,
) -> Result<(), NotmuchError> {
let Some(msg) = &tn.0 else {
return Ok(());
};
let mut addrs = vec![];
let hdr = &msg.headers.to;
if let Some(to) = hdr {
addrs.push(to);
} else {
let hdr = &msg.headers.x_original_to;
if let Some(to) = hdr {
addrs.push(to);
} else {
let hdr = &msg.headers.delivered_to;
if let Some(to) = hdr {
addrs.push(to);
};
};
};
let hdr = &msg.headers.cc;
if let Some(cc) = hdr {
addrs.push(cc);
};
for recipient in addrs {
mailparse::addrparse(&recipient)?
.into_inner()
.iter()
.for_each(|a| {
let mailparse::MailAddr::Single(si) = a else {
return;
};
let addr = &si.addr;
if addr == "couchmoney@gmail.com" || addr.ends_with("@xinu.tv") {
*r.entry(addr.to_lowercase()).or_default() += 1;
}
});
}
Ok(())
}
for t in ts.0 {
for tn in t.0 {
collect_from_thread_node(&mut r, &tn)?;
for sub_tn in tn.1 {
collect_from_thread_node(&mut r, &sub_tn)?;
}
}
}
Ok(r)
}
fn run_notmuch<I, S>(&self, args: I) -> Result<Vec<u8>, NotmuchError>
where
@@ -568,21 +702,6 @@ impl Notmuch {
let out = cmd.output()?;
Ok(out.stdout)
}
fn run_notmuch_pipe<I, S>(&self, args: I) -> Result<Child, NotmuchError>
where
I: IntoIterator<Item = S>,
S: AsRef<OsStr>,
{
let mut cmd = Command::new("notmuch");
if let Some(config_path) = &self.config_path {
cmd.arg("--config").arg(config_path);
}
cmd.args(args);
info!("{:?}", &cmd);
let child = cmd.stdout(Stdio::piped()).spawn()?;
Ok(child)
}
}
#[cfg(test)]
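Taken together, these changes make Notmuch a thin, cloneable wrapper over the notmuch CLI with tracing on every call. A minimal usage sketch (the queries and message id are illustrative, assuming a default notmuch config):

use letterbox_notmuch::{Notmuch, NotmuchError};

fn main() -> Result<(), NotmuchError> {
    let nm = Notmuch::default();
    // count() is now best-effort: an unparseable count is logged and reported as 0.
    let unread = nm.count("tag:unread")?;
    println!("{unread} unread messages");
    // message_ids() buffers ids as JSON instead of streaming a child process pipe.
    for id in nm.message_ids("tag:unread")? {
        println!("{id}");
    }
    // Tag manipulation wraps `notmuch tag +<tag> <query>`.
    nm.tag_add("flagged", "id:example@localhost")?;
    Ok(())
}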


@@ -1,11 +1,10 @@
use std::{
error::Error,
io::{stdout, Write},
time::{Duration, Instant},
time::Instant,
};
use itertools::Itertools;
use notmuch::{Notmuch, NotmuchError, SearchSummary, ThreadSet};
use letterbox_notmuch::Notmuch;
use rayon::iter::{ParallelBridge, ParallelIterator};
#[test]
@@ -23,11 +22,11 @@ fn parse_one() -> Result<(), Box<dyn Error>> {
let total = nm.count("*")? as f32;
let start = Instant::now();
nm.message_ids("*")?
.iter()
.enumerate()
.par_bridge()
.for_each(|(i, msg)| {
let msg = msg.expect("failed to unwrap msg");
let ts = nm
let _ts = nm
.show(&msg)
.expect(&format!("failed to show msg: {}", msg));
//println!("{:?}", ts);
@@ -77,11 +76,9 @@ fn parse_bulk() -> Result<(), Box<dyn Error>> {
.into_iter()
.enumerate()
//.par_bridge()
.for_each(|(i, chunk)| {
let msgs: Result<Vec<_>, _> = chunk.collect();
let msgs = msgs.expect("failed to unwrap msg");
.for_each(|(i, msgs)| {
let query = msgs.join(" OR ");
let ts = nm
let _ts = nm
.show(&query)
.expect(&format!("failed to show msgs: {}", query));
//println!("{:?}", ts);


@@ -1,9 +1,18 @@
[package]
name = "procmail2notmuch"
version = "0.1.0"
edition = "2021"
name = "letterbox-procmail2notmuch"
description = "Tool for generating notmuch rules from procmail"
authors.workspace = true
edition.workspace = true
license.workspace = true
publish.workspace = true
repository.workspace = true
version.workspace = true
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
anyhow = "1.0.69"
clap = { version = "4.5.37", features = ["derive", "env"] }
serde = { version = "1.0.219", features = ["derive"] }
sqlx = { version = "0.8.5", features = ["postgres", "runtime-tokio"] }
tokio = { version = "1.44.2", features = ["rt", "macros", "rt-multi-thread"] }


@@ -1,13 +1,19 @@
use std::{convert::Infallible, io::Write, str::FromStr};
use std::{collections::HashMap, convert::Infallible, io::Write, str::FromStr};
#[derive(Debug, Default)]
use clap::{Parser, Subcommand};
use serde::{Deserialize, Serialize};
use sqlx::{types::Json, PgPool};
#[derive(
Copy, Clone, Debug, Default, PartialEq, Eq, Hash, Ord, PartialOrd, Serialize, Deserialize,
)]
enum MatchType {
From,
Sender,
To,
Cc,
Subject,
List,
ListId,
DeliveredTo,
XForwardedTo,
ReplyTo,
@@ -17,16 +23,17 @@ enum MatchType {
#[default]
Unknown,
}
#[derive(Debug, Default)]
#[derive(Debug, Default, Serialize, Deserialize)]
struct Match {
match_type: MatchType,
needle: String,
}
#[derive(Debug, Default)]
#[derive(Debug, Default, Serialize, Deserialize)]
struct Rule {
stop_on_match: bool,
matches: Vec<Match>,
tags: Vec<String>,
tag: Option<String>,
}
fn unescape(s: &str) -> String {
@@ -38,6 +45,10 @@ fn cleanup_match(prefix: &str, s: &str) -> String {
}
mod matches {
// From https://linux.die.net/man/5/procmailrc
// If the regular expression contains '^TO_' it will be substituted by '(^((Original-)?(Resent-)?(To|Cc|Bcc)|(X-Envelope |Apparently(-Resent)?)-To):(.*[^-a-zA-Z0-9_.])?)'
// If the regular expression contains '^TO' it will be substituted by '(^((Original-)?(Resent-)?(To|Cc|Bcc)|(X-Envelope |Apparently(-Resent)?)-To):(.*[^a-zA-Z])?)', which should catch all destination specifications containing a specific word.
pub const TO: &'static str = "TO";
pub const CC: &'static str = "Cc";
pub const TOCC: &'static str = "(TO|Cc)";
@@ -109,7 +120,7 @@ impl FromStr for Match {
});
} else if needle.starts_with(LIST_ID) {
return Ok(Match {
match_type: MatchType::List,
match_type: MatchType::ListId,
needle: cleanup_match(LIST_ID, needle),
});
} else if needle.starts_with(REPLY_TO) {
@@ -149,13 +160,109 @@ impl FromStr for Match {
}
}
#[derive(Debug, Subcommand)]
enum Mode {
Debug,
Notmuchrc,
LoadSql {
#[arg(short, long, default_value = env!("DATABASE_URL"))]
dsn: String,
},
}
/// Tool for generating notmuch rules from procmail
#[derive(Parser, Debug)]
#[command(version, about, long_about = None)]
struct Args {
#[arg(short, long, default_value = "/home/wathiede/dotfiles/procmailrc")]
input: String,
#[command(subcommand)]
mode: Mode,
}
#[tokio::main]
async fn main() -> anyhow::Result<()> {
let args = Args::parse();
let mut rules = Vec::new();
let mut cur_rule = Rule::default();
for l in std::fs::read_to_string(args.input)?.lines() {
let l = if let Some(idx) = l.find('#') {
&l[..idx]
} else {
l
}
.trim();
if l.is_empty() {
continue;
}
if l.find('=').is_some() {
// Probably a variable assignment, skip line
continue;
}
let first = l.chars().nth(0).unwrap_or(' ');
match first {
':' => {
// start of rule
}
'*' => {
// add to current rule
let m: Match = l.parse()?;
cur_rule.matches.push(m);
}
'.' => {
// delivery to folder
cur_rule.tag = Some(cleanup_match(
"",
&l.replace('.', "/")
.replace(' ', "")
.trim_matches('/')
.to_string(),
));
rules.push(cur_rule);
cur_rule = Rule::default();
}
'/' => cur_rule = Rule::default(), // Ex. /dev/null
'|' => cur_rule = Rule::default(), // external command
'$' => {
// TODO(wathiede): tag messages with no other tag as 'inbox'
cur_rule.tag = Some(cleanup_match("", "inbox"));
rules.push(cur_rule);
cur_rule = Rule::default();
} // variable, should only be $DEFAULT in my config
_ => panic!("Unhandled first character '{}'\nLine: {}", first, l),
}
}
match args.mode {
Mode::Debug => print_rules(&rules),
Mode::Notmuchrc => notmuch_from_rules(std::io::stdout(), &rules)?,
Mode::LoadSql { dsn } => load_sql(&dsn, &rules).await?,
}
Ok(())
}
fn print_rules(rules: &[Rule]) {
let mut tally = HashMap::new();
for r in rules {
for m in &r.matches {
*tally.entry(m.match_type).or_insert(0) += 1;
}
}
let mut sorted: Vec<_> = tally.iter().map(|(k, v)| (v, k)).collect();
sorted.sort();
sorted.reverse();
for (v, k) in sorted {
println!("{k:?}: {v}");
}
}
fn notmuch_from_rules<W: Write>(mut w: W, rules: &[Rule]) -> anyhow::Result<()> {
// TODO(wathiede): if reindexing this many tags is too slow, see if combining rules per tag is
// faster.
let mut lines = Vec::new();
for r in rules {
for m in &r.matches {
for t in &r.tags {
if let Some(t) = &r.tag {
if let MatchType::Unknown = m.match_type {
eprintln!("rule has unknown match {:?}", r);
continue;
@@ -168,7 +275,7 @@ fn notmuch_from_rules<W: Write>(mut w: W, rules: &[Rule]) -> anyhow::Result<()>
MatchType::To => "to:",
MatchType::Cc => "to:",
MatchType::Subject => "subject:",
MatchType::List => "List-ID:",
MatchType::ListId => "List-ID:",
MatchType::Body => "",
// TODO(wathiede): these will probably require adding fields to notmuch
// index. Handle them later.
@@ -200,56 +307,25 @@ fn notmuch_from_rules<W: Write>(mut w: W, rules: &[Rule]) -> anyhow::Result<()>
Ok(())
}
fn main() -> anyhow::Result<()> {
let input = "/home/wathiede/dotfiles/procmailrc";
let mut rules = Vec::new();
let mut cur_rule = Rule::default();
for l in std::fs::read_to_string(input)?.lines() {
let l = if let Some(idx) = l.find('#') {
&l[..idx]
} else {
l
}
.trim();
if l.is_empty() {
continue;
}
if l.find('=').is_some() {
// Probably a variable assignment, skip line
continue;
}
let first = l.chars().nth(0).unwrap_or(' ');
match first {
':' => {
// start of rule
}
'*' => {
// add to current rule
let m: Match = l.parse()?;
cur_rule.matches.push(m);
}
'.' => {
// delivery to folder
cur_rule.tags.push(cleanup_match(
"",
&l.replace('.', "/")
.replace(' ', "")
.trim_matches('/')
.to_string(),
));
rules.push(cur_rule);
cur_rule = Rule::default();
}
'|' => cur_rule = Rule::default(), // external command
'$' => {
// TODO(wathiede): tag messages with no other tag as 'inbox'
cur_rule.tags.push(cleanup_match("", "inbox"));
rules.push(cur_rule);
cur_rule = Rule::default();
} // variable, should only be $DEFAULT in my config
_ => panic!("Unhandled first character '{}' {}", first, l),
}
async fn load_sql(dsn: &str, rules: &[Rule]) -> anyhow::Result<()> {
let pool = PgPool::connect(dsn).await?;
println!("clearing email_rule table");
sqlx::query!("DELETE FROM email_rule")
.execute(&pool)
.await?;
for (order, rule) in rules.iter().enumerate() {
println!("inserting {order}: {rule:?}");
sqlx::query!(
r#"
INSERT INTO email_rule (sort_order, rule)
VALUES ($1, $2)
"#,
order as i32,
Json(rule) as _
)
.execute(&pool)
.await?;
}
notmuch_from_rules(std::io::stdout(), &rules)?;
Ok(())
}
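For reference, a sketch of how one recipe flows through this parser (the recipe and the resulting values are hypothetical; the exact match_type and needle depend on the FromStr branches and cleanup_match):

// Given a procmailrc recipe such as:
//   :0:
//   * ^TO_updates@example.com
//   .lists.updates
// the ':' line opens a rule, and the '*' line is parsed via Match::from_str:
let m: Match = "* ^TO_updates@example.com".parse()?;
// ...presumably yielding something like:
//   Match { match_type: MatchType::To, needle: "updates@example.com" }
// The '.' delivery line then sets tag = Some("lists/updates") and pushes the Rule,
// which Mode::Notmuchrc later renders with a "to:" prefix and Mode::LoadSql
// serializes as JSONB into the email_rule table.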

10
procmail2notmuch/update.sh Executable file

@@ -0,0 +1,10 @@
set -e
cd ~/dotfiles
git diff
scp nasx:.procmailrc procmailrc
git diff
cd ~/src/xinu.tv/letterbox/procmail2notmuch
cargo run > /tmp/notmuch.tags
mv /tmp/notmuch.tags ~/dotfiles/notmuch.tags
cd ~/dotfiles
git diff

6
renovate.json Normal file

@@ -0,0 +1,6 @@
{
"$schema": "https://docs.renovatebot.com/renovate-schema.json",
"extends": [
"config:recommended"
]
}


@@ -0,0 +1,5 @@
#!/usr/bin/env bash
set -e -x
cargo-set-version set-version --bump patch
VERSION="$(awk -F\" '/^version/ {print $2}' server/Cargo.toml)"
git commit Cargo.lock */Cargo.toml -m "Bumping version to ${VERSION:?}"


@@ -0,0 +1,22 @@
{
"db_name": "PostgreSQL",
"query": "\nSELECT\n url\nFROM email_photo ep\nJOIN email_address ea\nON ep.id = ea.email_photo_id\nWHERE\n address = $1\n ",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "url",
"type_info": "Text"
}
],
"parameters": {
"Left": [
"Text"
]
},
"nullable": [
false
]
},
"hash": "126e16a4675e8d79f330b235f9e1b8614ab1e1526e4e69691c5ebc70d54a42ef"
}


@@ -0,0 +1,32 @@
{
"db_name": "PostgreSQL",
"query": "SELECT\n site,\n name,\n count (\n NOT is_read\n OR NULL\n ) unread\nFROM\n post AS p\n JOIN feed AS f ON p.site = f.slug --\n -- TODO: figure this out to make the query faster when only looking for unread\n --WHERE\n -- (\n -- NOT $1\n -- OR NOT is_read\n -- )\nGROUP BY\n 1,\n 2\nORDER BY\n site\n",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "site",
"type_info": "Text"
},
{
"ordinal": 1,
"name": "name",
"type_info": "Text"
},
{
"ordinal": 2,
"name": "unread",
"type_info": "Int8"
}
],
"parameters": {
"Left": []
},
"nullable": [
true,
true,
null
]
},
"hash": "2dcbedef656e1b725c5ba4fb67d31ce7962d8714449b2fb630f49a7ed1acc270"
}


@@ -0,0 +1,70 @@
{
"db_name": "PostgreSQL",
"query": "SELECT\n date,\n is_read,\n link,\n site,\n summary,\n clean_summary,\n title,\n name,\n homepage\nFROM\n post AS p\nINNER JOIN feed AS f ON p.site = f.slug\nWHERE\n uid = $1\n",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "date",
"type_info": "Timestamp"
},
{
"ordinal": 1,
"name": "is_read",
"type_info": "Bool"
},
{
"ordinal": 2,
"name": "link",
"type_info": "Text"
},
{
"ordinal": 3,
"name": "site",
"type_info": "Text"
},
{
"ordinal": 4,
"name": "summary",
"type_info": "Text"
},
{
"ordinal": 5,
"name": "clean_summary",
"type_info": "Text"
},
{
"ordinal": 6,
"name": "title",
"type_info": "Text"
},
{
"ordinal": 7,
"name": "name",
"type_info": "Text"
},
{
"ordinal": 8,
"name": "homepage",
"type_info": "Text"
}
],
"parameters": {
"Left": [
"Text"
]
},
"nullable": [
true,
true,
false,
true,
true,
true,
true,
true,
true
]
},
"hash": "383221a94bc3746322ba78e41cde37994440ee67dc32e88d2394c51211bde6cd"
}


@@ -0,0 +1,32 @@
{
"db_name": "PostgreSQL",
"query": "SELECT\n p.id,\n link,\n clean_summary\nFROM\n post AS p\nINNER JOIN feed AS f ON p.site = f.slug -- necessary to weed out nzb posts\nWHERE\n search_summary IS NULL\n -- TODO remove AND link ~ '^<'\nORDER BY\n ROW_NUMBER() OVER (PARTITION BY site ORDER BY date DESC)\nLIMIT 100;\n",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "id",
"type_info": "Int4"
},
{
"ordinal": 1,
"name": "link",
"type_info": "Text"
},
{
"ordinal": 2,
"name": "clean_summary",
"type_info": "Text"
}
],
"parameters": {
"Left": []
},
"nullable": [
false,
false,
true
]
},
"hash": "3d271b404f06497a5dcde68cf6bf07291d70fa56058ea736ac24e91d33050c04"
}


@@ -0,0 +1,24 @@
{
"db_name": "PostgreSQL",
"query": "SELECT COUNT(*) AS count\nFROM\n post\nWHERE\n (\n $1::text IS NULL\n OR site = $1\n )\n AND (\n NOT $2\n OR NOT is_read\n )\n AND (\n $3::text IS NULL\n OR TO_TSVECTOR('english', search_summary)\n @@ WEBSEARCH_TO_TSQUERY('english', $3)\n )\n",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "count",
"type_info": "Int8"
}
],
"parameters": {
"Left": [
"Text",
"Bool",
"Text"
]
},
"nullable": [
null
]
},
"hash": "8c1b3c78649135e98b89092237750088433f7ff1b7c2ddeedec553406ea9f203"
}


@@ -0,0 +1,15 @@
{
"db_name": "PostgreSQL",
"query": "UPDATE\n post\nSET\n is_read = $1\nWHERE\n uid = $2\n",
"describe": {
"columns": [],
"parameters": {
"Left": [
"Bool",
"Text"
]
},
"nullable": []
},
"hash": "b39147b9d06171cb742141eda4675688cb702fb284758b1224ed3aa2d7f3b3d9"
}
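Each of these .sqlx JSON files is offline metadata captured by `cargo sqlx prepare` (wired into the Justfile and Makefile above) so the sqlx macros can type-check queries without a live database. A hedged sketch of the call this particular entry backs (variable names are assumptions):

// Presumably paired with a call like this on the server:
sqlx::query!(
    "UPDATE post SET is_read = $1 WHERE uid = $2",
    is_read, // $1: Bool
    uid,     // $2: Text
)
.execute(&pool)
.await?;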


@@ -0,0 +1,15 @@
{
"db_name": "PostgreSQL",
"query": "UPDATE post SET search_summary = $1 WHERE id = $2",
"describe": {
"columns": [],
"parameters": {
"Left": [
"Text",
"Int4"
]
},
"nullable": []
},
"hash": "ef8327f039dbfa8f4e59b7a77a6411252a346bf51cf940024a17d9fbb2df173c"
}


@@ -0,0 +1,56 @@
{
"db_name": "PostgreSQL",
"query": "SELECT\n site,\n date,\n is_read,\n title,\n uid,\n name\nFROM\n post p\n JOIN feed f ON p.site = f.slug\nWHERE\n ($1::text IS NULL OR site = $1)\n AND (\n NOT $2\n OR NOT is_read\n )\n AND (\n $5 :: text IS NULL\n OR to_tsvector('english', search_summary) @@ websearch_to_tsquery('english', $5)\n )\nORDER BY\n date DESC,\n title OFFSET $3\nLIMIT\n $4\n",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "site",
"type_info": "Text"
},
{
"ordinal": 1,
"name": "date",
"type_info": "Timestamp"
},
{
"ordinal": 2,
"name": "is_read",
"type_info": "Bool"
},
{
"ordinal": 3,
"name": "title",
"type_info": "Text"
},
{
"ordinal": 4,
"name": "uid",
"type_info": "Text"
},
{
"ordinal": 5,
"name": "name",
"type_info": "Text"
}
],
"parameters": {
"Left": [
"Text",
"Bool",
"Int8",
"Int8",
"Text"
]
},
"nullable": [
true,
true,
true,
true,
false,
true
]
},
"hash": "fc4607f02cc76a5f3a6629cce4507c74f52ae44820897b47365da3f339d1da06"
}


@@ -1,23 +1,60 @@
[package]
name = "server"
version = "0.1.0"
edition = "2021"
name = "letterbox-server"
default-run = "letterbox-server"
description = "Backend for letterbox"
authors.workspace = true
edition.workspace = true
license.workspace = true
publish.workspace = true
repository.workspace = true
version.workspace = true
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
rocket = { version = "0.5.0-rc.2", features = [ "json" ] }
rocket_cors = { git = "https://github.com/lawliet89/rocket_cors", branch = "master" }
notmuch = { path = "../notmuch" }
shared = { path = "../shared" }
serde_json = "1.0.87"
thiserror = "1.0.37"
serde = { version = "1.0.147", features = ["derive"] }
ammonia = "4.0.0"
anyhow = "1.0.79"
async-graphql = { version = "7", features = ["log"] }
async-graphql-axum = "7.0.15"
async-trait = "0.1.81"
axum = { version = "0.8.3", features = ["ws"] }
axum-macros = "0.5.0"
build-info = "0.0.40"
cacher = { version = "0.2.0", registry = "xinu" }
chrono = "0.4.39"
clap = { version = "4.5.36", features = ["derive"] }
css-inline = "0.14.0"
futures = "0.3.31"
headers = "0.4.0"
html-escape = "0.2.13"
letterbox-notmuch = { version = "0.15.11", path = "../notmuch", registry = "xinu" }
letterbox-shared = { version = "0.15.11", path = "../shared", registry = "xinu" }
linkify = "0.10.0"
log = "0.4.17"
lol_html = "2.0.0"
mailparse = "0.16.0"
maplit = "1.0.2"
memmap = "0.7.0"
regex = "1.11.1"
reqwest = { version = "0.12.7", features = ["blocking"] }
scraper = "0.23.0"
serde = { version = "1.0.147", features = ["derive"] }
serde_json = "1.0.87"
sqlx = { version = "0.8.2", features = ["postgres", "runtime-tokio", "time"] }
tantivy = { version = "0.24.0", optional = true }
thiserror = "2.0.0"
tokio = "1.26.0"
glog = "0.1.0"
tower-http = { version = "0.6.2", features = ["trace"] }
tracing = "0.1.41"
url = "2.5.2"
urlencoding = "2.1.3"
#xtracing = { git = "http://git-private.h.xinu.tv/wathiede/xtracing.git" }
#xtracing = { path = "../../xtracing" }
xtracing = { version = "0.3.0", registry = "xinu" }
[dependencies.rocket_contrib]
version = "0.4.11"
default-features = false
features = ["json"]
[build-dependencies]
build-info-build = "0.0.40"
[features]
#default = [ "tantivy" ]
tantivy = ["dep:tantivy"]


@@ -1,9 +1,13 @@
[release]
address = "0.0.0.0"
port = 9345
newsreader_database_url = "postgres://newsreader@nixos-07.h.xinu.tv/newsreader"
newsreader_tantivy_db_path = "../target/database/newsreader"
[debug]
address = "0.0.0.0"
port = 9345
# Uncomment to make it production like.
#log_level = "critical"
newsreader_database_url = "postgres://newsreader@nixos-07.h.xinu.tv/newsreader"
newsreader_tantivy_db_path = "../target/database/newsreader"
slurp_cache_path = "/tmp/letterbox/slurp"

5
server/build.rs Normal file

@@ -0,0 +1,5 @@
fn main() {
// Calling `build_info_build::build_script` collects all data and makes it available to `build_info::build_info!`
// and `build_info::format!` in the main program.
build_info_build::build_script();
}
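The data collected here is consumed in the main program through the build-info crate's macros; a minimal sketch of that side (assuming the crate's documented API, not code from this diff):

// Hypothetical consumer in the server binary:
build_info::build_info!(fn build_info);

fn log_version() {
    // format! interpolates values that build.rs collected at compile time.
    println!("{}", build_info::format!("{} v{}", $.crate_info.name, $.crate_info.version));
}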


@@ -0,0 +1,3 @@
DROP INDEX IF EXISTS post_summary_idx;
DROP INDEX IF EXISTS post_site_idx;
DROP INDEX IF EXISTS post_title_idx;


@@ -0,0 +1,3 @@
CREATE INDEX post_summary_idx ON post USING GIN (to_tsvector('english', summary));
CREATE INDEX post_site_idx ON post USING GIN (to_tsvector('english', site));
CREATE INDEX post_title_idx ON post USING GIN (to_tsvector('english', title));


@@ -0,0 +1,24 @@
BEGIN;
ALTER TABLE IF EXISTS public."Email" DROP CONSTRAINT IF EXISTS email_avatar_fkey;
ALTER TABLE IF EXISTS public."EmailDisplayName" DROP CONSTRAINT IF EXISTS email_id_fk;
ALTER TABLE IF EXISTS public."Message" DROP CONSTRAINT IF EXISTS message_to_fkey;
ALTER TABLE IF EXISTS public."Message" DROP CONSTRAINT IF EXISTS message_cc_fkey;
ALTER TABLE IF EXISTS public."Message" DROP CONSTRAINT IF EXISTS message_from_fkey;
ALTER TABLE IF EXISTS public."Message" DROP CONSTRAINT IF EXISTS message_header_fkey;
ALTER TABLE IF EXISTS public."Message" DROP CONSTRAINT IF EXISTS message_file_fkey;
ALTER TABLE IF EXISTS public."Message" DROP CONSTRAINT IF EXISTS message_body_id_fkey;
ALTER TABLE IF EXISTS public."Message" DROP CONSTRAINT IF EXISTS message_thread_fkey;
ALTER TABLE IF EXISTS public."Message" DROP CONSTRAINT IF EXISTS message_tag_fkey;
DROP TABLE IF EXISTS public."Email";
DROP TABLE IF EXISTS public."EmailDisplayName";
DROP TABLE IF EXISTS public."Message";
DROP TABLE IF EXISTS public."Header";
DROP TABLE IF EXISTS public."File";
DROP TABLE IF EXISTS public."Avatar";
DROP TABLE IF EXISTS public."Body";
DROP TABLE IF EXISTS public."Thread";
DROP TABLE IF EXISTS public."Tag";
END;


@@ -0,0 +1,174 @@
-- This script was generated by the ERD tool in pgAdmin 4.
-- Please log an issue at https://github.com/pgadmin-org/pgadmin4/issues/new/choose if you find any bugs, including reproduction steps.
BEGIN;
ALTER TABLE IF EXISTS public."Email" DROP CONSTRAINT IF EXISTS email_avatar_fkey;
ALTER TABLE IF EXISTS public."EmailDisplayName" DROP CONSTRAINT IF EXISTS email_id_fk;
ALTER TABLE IF EXISTS public."Message" DROP CONSTRAINT IF EXISTS message_to_fkey;
ALTER TABLE IF EXISTS public."Message" DROP CONSTRAINT IF EXISTS message_cc_fkey;
ALTER TABLE IF EXISTS public."Message" DROP CONSTRAINT IF EXISTS message_from_fkey;
ALTER TABLE IF EXISTS public."Message" DROP CONSTRAINT IF EXISTS message_header_fkey;
ALTER TABLE IF EXISTS public."Message" DROP CONSTRAINT IF EXISTS message_file_fkey;
ALTER TABLE IF EXISTS public."Message" DROP CONSTRAINT IF EXISTS message_body_id_fkey;
ALTER TABLE IF EXISTS public."Message" DROP CONSTRAINT IF EXISTS message_thread_fkey;
ALTER TABLE IF EXISTS public."Message" DROP CONSTRAINT IF EXISTS message_tag_fkey;
CREATE TABLE IF NOT EXISTS public."Email"
(
id integer NOT NULL GENERATED ALWAYS AS IDENTITY,
address text NOT NULL,
avatar_id integer,
PRIMARY KEY (id),
CONSTRAINT avatar_id UNIQUE (avatar_id)
);
CREATE TABLE IF NOT EXISTS public."EmailDisplayName"
(
id integer NOT NULL GENERATED ALWAYS AS IDENTITY,
email_id integer NOT NULL,
PRIMARY KEY (id)
);
CREATE TABLE IF NOT EXISTS public."Message"
(
id integer NOT NULL GENERATED ALWAYS AS IDENTITY,
subject text,
"from" integer,
"to" integer,
cc integer,
header_id integer,
hash text NOT NULL,
file_id integer NOT NULL,
date timestamp with time zone NOT NULL,
unread boolean NOT NULL,
body_id integer NOT NULL,
thread_id integer NOT NULL,
tag_id integer,
CONSTRAINT body_id UNIQUE (body_id)
);
CREATE TABLE IF NOT EXISTS public."Header"
(
id integer NOT NULL GENERATED ALWAYS AS IDENTITY,
key text NOT NULL,
value text NOT NULL,
PRIMARY KEY (id)
);
CREATE TABLE IF NOT EXISTS public."File"
(
id integer NOT NULL GENERATED ALWAYS AS IDENTITY,
path text NOT NULL,
PRIMARY KEY (id)
);
CREATE TABLE IF NOT EXISTS public."Avatar"
(
id integer NOT NULL GENERATED ALWAYS AS IDENTITY,
url text NOT NULL,
PRIMARY KEY (id)
);
CREATE TABLE IF NOT EXISTS public."Body"
(
id integer NOT NULL GENERATED ALWAYS AS IDENTITY,
text text NOT NULL,
PRIMARY KEY (id)
);
CREATE TABLE IF NOT EXISTS public."Thread"
(
id integer NOT NULL GENERATED ALWAYS AS IDENTITY,
PRIMARY KEY (id)
);
CREATE TABLE IF NOT EXISTS public."Tag"
(
id integer NOT NULL GENERATED ALWAYS AS IDENTITY,
name text NOT NULL,
display text,
fg_color integer,
bg_color integer,
PRIMARY KEY (id)
);
ALTER TABLE IF EXISTS public."Email"
ADD CONSTRAINT email_avatar_fkey FOREIGN KEY (avatar_id)
REFERENCES public."Avatar" (id) MATCH SIMPLE
ON UPDATE NO ACTION
ON DELETE NO ACTION
NOT VALID;
ALTER TABLE IF EXISTS public."EmailDisplayName"
ADD CONSTRAINT email_id_fk FOREIGN KEY (email_id)
REFERENCES public."Email" (id) MATCH SIMPLE
ON UPDATE NO ACTION
ON DELETE NO ACTION
NOT VALID;
ALTER TABLE IF EXISTS public."Message"
ADD CONSTRAINT message_to_fkey FOREIGN KEY ("to")
REFERENCES public."Email" (id) MATCH SIMPLE
ON UPDATE NO ACTION
ON DELETE NO ACTION
NOT VALID;
ALTER TABLE IF EXISTS public."Message"
ADD CONSTRAINT message_cc_fkey FOREIGN KEY (cc)
REFERENCES public."Email" (id) MATCH SIMPLE
ON UPDATE NO ACTION
ON DELETE NO ACTION
NOT VALID;
ALTER TABLE IF EXISTS public."Message"
ADD CONSTRAINT message_from_fkey FOREIGN KEY ("from")
REFERENCES public."Email" (id) MATCH SIMPLE
ON UPDATE NO ACTION
ON DELETE NO ACTION
NOT VALID;
ALTER TABLE IF EXISTS public."Message"
ADD CONSTRAINT message_header_fkey FOREIGN KEY (header_id)
REFERENCES public."Header" (id) MATCH SIMPLE
ON UPDATE NO ACTION
ON DELETE NO ACTION
NOT VALID;
ALTER TABLE IF EXISTS public."Message"
ADD CONSTRAINT message_file_fkey FOREIGN KEY (file_id)
REFERENCES public."File" (id) MATCH SIMPLE
ON UPDATE NO ACTION
ON DELETE NO ACTION
NOT VALID;
ALTER TABLE IF EXISTS public."Message"
ADD CONSTRAINT message_body_id_fkey FOREIGN KEY (body_id)
REFERENCES public."Body" (id) MATCH SIMPLE
ON UPDATE NO ACTION
ON DELETE NO ACTION
NOT VALID;
ALTER TABLE IF EXISTS public."Message"
ADD CONSTRAINT message_thread_fkey FOREIGN KEY (thread_id)
REFERENCES public."Thread" (id) MATCH SIMPLE
ON UPDATE NO ACTION
ON DELETE NO ACTION
NOT VALID;
ALTER TABLE IF EXISTS public."Message"
ADD CONSTRAINT message_tag_fkey FOREIGN KEY (tag_id)
REFERENCES public."Tag" (id) MATCH SIMPLE
ON UPDATE NO ACTION
ON DELETE NO ACTION
NOT VALID;
END;


@@ -0,0 +1,3 @@
-- Add down migration script here
ALTER TABLE
post DROP CONSTRAINT post_link_key;


@@ -0,0 +1,28 @@
WITH dupes AS (
SELECT
uid,
link,
Row_number() over(
PARTITION by link
ORDER BY
link
) AS RowNumber
FROM
post
)
DELETE FROM
post
WHERE
uid IN (
SELECT
uid
FROM
dupes
WHERE
RowNumber > 1
);
ALTER TABLE
post
ADD
UNIQUE (link);


@@ -0,0 +1,7 @@
ALTER TABLE
post
ALTER COLUMN
link DROP NOT NULL;
ALTER TABLE
post DROP CONSTRAINT link;


@@ -0,0 +1,17 @@
DELETE FROM
post
WHERE
link IS NULL
OR link = '';
ALTER TABLE
post
ALTER COLUMN
link
SET
NOT NULL;
ALTER TABLE
post
ADD
CONSTRAINT link CHECK (link <> '');


@@ -0,0 +1,3 @@
DROP TABLE IF EXISTS email_address;
DROP TABLE IF EXISTS photo;
DROP TABLE IF EXISTS google_person;


@@ -0,0 +1,19 @@
-- Add up migration script here
CREATE TABLE IF NOT EXISTS google_person (
id SERIAL PRIMARY KEY,
resource_name TEXT NOT NULL UNIQUE,
display_name TEXT NOT NULL
);
CREATE TABLE IF NOT EXISTS email_photo (
id SERIAL PRIMARY KEY,
google_person_id INTEGER REFERENCES google_person (id) UNIQUE,
url TEXT NOT NULL
);
CREATE TABLE IF NOT EXISTS email_address (
id SERIAL PRIMARY KEY,
address TEXT NOT NULL UNIQUE,
email_photo_id INTEGER REFERENCES email_photo (id),
google_person_id INTEGER REFERENCES google_person (id)
);


@@ -0,0 +1,5 @@
-- Add down migration script here
DROP INDEX post_summary_idx;
CREATE INDEX post_summary_idx ON post USING gin (
to_tsvector('english', summary)
);


@@ -0,0 +1,11 @@
-- Something like this around summary in the idx w/ tsvector
DROP INDEX post_summary_idx;
CREATE INDEX post_summary_idx ON post USING gin (to_tsvector(
'english',
regexp_replace(
regexp_replace(summary, '<[^>]+>', ' ', 'g'),
'\s+',
' ',
'g'
)
));


@@ -0,0 +1,2 @@
-- Add down migration script here
DROP INDEX nzb_posts_created_at_idx;


@@ -0,0 +1,2 @@
-- Add up migration script here
CREATE INDEX nzb_posts_created_at_idx ON nzb_posts USING btree (created_at);


@@ -0,0 +1,15 @@
-- Add down migration script here
BEGIN;
DROP INDEX IF EXISTS post_search_summary_idx;
ALTER TABLE post DROP search_summary;
-- CREATE INDEX post_summary_idx ON post USING gin (to_tsvector(
-- 'english',
-- regexp_replace(
-- regexp_replace(summary, '<[^>]+>', ' ', 'g'),
-- '\s+',
-- ' ',
-- 'g'
-- )
-- ));
COMMIT;


@@ -0,0 +1,14 @@
-- Add up migration script here
BEGIN;
DROP INDEX IF EXISTS post_summary_idx;
ALTER TABLE post ADD search_summary TEXT;
CREATE INDEX post_search_summary_idx ON post USING gin (
to_tsvector('english', search_summary)
);
UPDATE post SET search_summary = regexp_replace(
regexp_replace(summary, '<[^>]+>', ' ', 'g'),
'\s+',
' ',
'g'
);
COMMIT;


@@ -0,0 +1,20 @@
-- Bad examples:
-- https://nzbfinder.ws/getnzb/d2c3e5a08abadd985dccc6a574122892030b6a9a.nzb&i=95972&r=b55082d289937c050dedc203c9653850
-- https://nzbfinder.ws/getnzb?id=45add174-7da4-4445-bf2b-a67dbbfc07fe.nzb&r=b55082d289937c050dedc203c9653850
-- https://nzbfinder.ws/api/v1/getnzb?id=82486020-c192-4fa0-a7e7-798d7d72e973.nzb&r=b55082d289937c050dedc203c9653850
UPDATE nzb_posts
SET link =
regexp_replace(
regexp_replace(
regexp_replace(
link,
'https://nzbfinder.ws/getnzb/',
'https://nzbfinder.ws/api/v1/getnzb?id='
),
'https://nzbfinder.ws/getnzb',
'https://nzbfinder.ws/api/v1/getnzb'
),
'&r=',
'&apikey='
)
;


@@ -0,0 +1,3 @@
DROP TABLE IF EXISTS email_rule;
-- Add down migration script here


@@ -0,0 +1,5 @@
CREATE TABLE IF NOT EXISTS email_rule (
id integer NOT NULL GENERATED ALWAYS AS IDENTITY,
sort_order integer NOT NULL,
rule jsonb NOT NULL
);

14
server/sql/all-posts.sql Normal file

@@ -0,0 +1,14 @@
SELECT
site,
title,
summary,
link,
date,
is_read,
uid,
p.id id
FROM
post AS p
JOIN feed AS f ON p.site = f.slug -- necessary to weed out nzb posts
ORDER BY
date DESC;

6
server/sql/all-uids.sql Normal file

@@ -0,0 +1,6 @@
SELECT
uid
FROM
post AS p
JOIN feed AS f ON p.site = f.slug -- necessary to weed out nzb posts
;

17
server/sql/count.sql Normal file

@@ -0,0 +1,17 @@
SELECT COUNT(*) AS count
FROM
post
WHERE
(
$1::text IS NULL
OR site = $1
)
AND (
NOT $2
OR NOT is_read
)
AND (
$3::text IS NULL
OR TO_TSVECTOR('english', search_summary)
@@ WEBSEARCH_TO_TSQUERY('english', $3)
)
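Each predicate is written so that a NULL parameter disables that clause, letting one prepared statement cover the all/per-site/unread-only/full-text combinations. A sketch of a matching call site (hypothetical; sqlx's file-based macro keeps the SQL in sql/count.sql):

// COUNT(*) is reported as nullable in the prepared metadata, hence the Option.
let count = sqlx::query_file_scalar!("sql/count.sql", site, unread_only, search)
    .fetch_one(&pool)
    .await?
    .unwrap_or(0);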


@@ -0,0 +1,13 @@
SELECT
p.id,
link,
clean_summary
FROM
post AS p
INNER JOIN feed AS f ON p.site = f.slug -- necessary to weed out nzb posts
WHERE
search_summary IS NULL
-- TODO remove AND link ~ '^<'
ORDER BY
ROW_NUMBER() OVER (PARTITION BY site ORDER BY date DESC)
LIMIT 100;


@@ -0,0 +1,14 @@
SELECT
site AS "site!",
title AS "title!",
summary AS "summary!",
link AS "link!",
date AS "date!",
is_read AS "is_read!",
uid AS "uid!",
p.id id
FROM
post p
JOIN feed f ON p.site = f.slug
WHERE
uid = ANY ($1);


@@ -0,0 +1,6 @@
UPDATE
post
SET
is_read = $1
WHERE
uid = $2

21
server/sql/tags.sql Normal file

@@ -0,0 +1,21 @@
SELECT
site,
name,
count (
NOT is_read
OR NULL
) unread
FROM
post AS p
JOIN feed AS f ON p.site = f.slug --
-- TODO: figure this out to make the query faster when only looking for unread
--WHERE
-- (
-- NOT $1
-- OR NOT is_read
-- )
GROUP BY
1,
2
ORDER BY
site

15
server/sql/thread.sql Normal file

@@ -0,0 +1,15 @@
SELECT
date,
is_read,
link,
site,
summary,
clean_summary,
title,
name,
homepage
FROM
post AS p
INNER JOIN feed AS f ON p.site = f.slug
WHERE
uid = $1


@@ -0,0 +1,14 @@
SELECT
site,
date,
is_read,
title,
uid,
name
FROM
post p
JOIN feed f ON p.site = f.slug
WHERE
uid = ANY ($1)
ORDER BY
date DESC;

25
server/sql/threads.sql Normal file

@@ -0,0 +1,25 @@
SELECT
site,
date,
is_read,
title,
uid,
name
FROM
post p
JOIN feed f ON p.site = f.slug
WHERE
($1::text IS NULL OR site = $1)
AND (
NOT $2
OR NOT is_read
)
AND (
$5 :: text IS NULL
OR to_tsvector('english', search_summary) @@ websearch_to_tsquery('english', $5)
)
ORDER BY
date DESC,
title OFFSET $3
LIMIT
$4


@@ -0,0 +1,13 @@
select t.id, tt.tokid, tt.alias, length(t.token), t.token from (
select id, (ts_parse('default',
-- regexp_replace(
-- regexp_replace(summary, '<[^>]+>', ' ', 'g'),
-- '\s+',
-- ' ',
-- 'g'
-- )
summary
)).* from post) t
inner join ts_token_type('default') tt
on t.tokid = tt.tokid
where length(token) >= 2*1024;


@@ -0,0 +1,16 @@
use std::fs;
use letterbox_server::sanitize_html;
fn main() -> anyhow::Result<()> {
let mut args = std::env::args().skip(1);
let src = args.next().expect("source not specified");
let dst = args.next().expect("destination not specified");
println!("Sanitizing {src} into {dst}");
let bytes = fs::read(src)?;
let html = String::from_utf8_lossy(&bytes);
let html = sanitize_html(&html, "", &None)?;
fs::write(dst, html)?;
Ok(())
}


@@ -0,0 +1,21 @@
use std::fs;
use url::Url;
fn main() -> anyhow::Result<()> {
println!("PWD: {}", std::env::current_dir()?.display());
let _url = "https://slashdot.org/story/25/01/24/1813201/walgreens-replaced-fridge-doors-with-smart-screens-its-now-a-200-million-fiasco?utm_source=rss1.0mainlinkanon&utm_medium=feed";
let _url = "https://hackaday.com/2025/01/24/hackaday-podcast-episode-305-caustic-clocks-practice-bones-and-brick-layers/";
let _url = "https://theonion.com/monster-devastated-to-see-film-depicting-things-he-told-guillermo-del-toro-in-confidence/";
let _url = "https://trofi.github.io/posts/330-another-nix-language-nondeterminism-example.html";
let _url = "https://blog.cloudflare.com/ddos-threat-report-for-2024-q4/";
let url = "https://trofi.github.io/posts/330-another-nix-language-nondeterminism-example.html";
let body = reqwest::blocking::get(url)?.text()?;
let output = "/tmp/h2md/output.html";
let inliner = css_inline::CSSInliner::options()
.base_url(Url::parse(url).ok())
.build();
let inlined = inliner.inline(&body)?;
fs::write(output, inlined)?;
Ok(())
}

View File

@@ -0,0 +1,316 @@
// Rocket generates a lot of warnings for handlers
// TODO: figure out why
#![allow(unreachable_patterns)]
use std::{error::Error, net::SocketAddr, sync::Arc, time::Duration};
use async_graphql::{extensions, http::GraphiQLSource, Schema};
use async_graphql_axum::{GraphQL, GraphQLSubscription};
// Allows extracting the IP of the connecting user
use axum::extract::connect_info::ConnectInfo;
use axum::{
extract::{self, ws::WebSocketUpgrade, Query, State},
http::{header, StatusCode},
response::{self, IntoResponse, Response},
routing::{any, get, post},
Router,
};
use cacher::FilesystemCacher;
use clap::Parser;
use letterbox_notmuch::Notmuch;
#[cfg(feature = "tantivy")]
use letterbox_server::tantivy::TantivyConnection;
use letterbox_server::{
graphql::{compute_catchup_ids, Attachment, MutationRoot, QueryRoot, SubscriptionRoot},
nm::{attachment_bytes, cid_attachment_bytes},
ws::ConnectionTracker,
};
use letterbox_shared::WebsocketMessage;
use serde::Deserialize;
use sqlx::postgres::PgPool;
use tokio::{net::TcpListener, sync::Mutex};
use tower_http::trace::{DefaultMakeSpan, TraceLayer};
use tracing::{info, warn};
// Make our own error type that wraps `letterbox_server::ServerError`.
struct AppError(letterbox_server::ServerError);
// Tell axum how to convert `AppError` into a response.
impl IntoResponse for AppError {
fn into_response(self) -> Response {
(
StatusCode::INTERNAL_SERVER_ERROR,
format!("Something went wrong: {}", self.0),
)
.into_response()
}
}
// This enables using `?` on functions that return `Result<_, letterbox_server::ServerError>` to turn them into
// `Result<_, AppError>`. That way you don't need to do that manually.
impl<E> From<E> for AppError
where
E: Into<letterbox_server::ServerError>,
{
fn from(err: E) -> Self {
Self(err.into())
}
}
fn inline_attachment_response(attachment: Attachment) -> impl IntoResponse {
info!("attachment filename {:?}", attachment.filename);
let mut hdr_map = headers::HeaderMap::new();
if let Some(filename) = attachment.filename {
hdr_map.insert(
header::CONTENT_DISPOSITION,
format!(r#"inline; filename="{}""#, filename)
.parse()
.unwrap(),
);
}
if let Some(ct) = attachment.content_type {
hdr_map.insert(header::CONTENT_TYPE, ct.parse().unwrap());
}
info!("hdr_map {hdr_map:?}");
(hdr_map, attachment.bytes).into_response()
}
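// Identical to inline_attachment_response except for the Content-Disposition
// value: "attachment" asks the browser to save the part instead of rendering it inline.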
fn download_attachment_response(attachment: Attachment) -> impl IntoResponse {
info!("attachment filename {:?}", attachment.filename);
let mut hdr_map = headers::HeaderMap::new();
if let Some(filename) = attachment.filename {
hdr_map.insert(
header::CONTENT_DISPOSITION,
format!(r#"attachment; filename="{}""#, filename)
.parse()
.unwrap(),
);
}
if let Some(ct) = attachment.content_type {
hdr_map.insert(header::CONTENT_TYPE, ct.parse().unwrap());
}
info!("hdr_map {hdr_map:?}");
(hdr_map, attachment.bytes).into_response()
}
#[axum_macros::debug_handler]
async fn view_attachment(
State(AppState { nm, .. }): State<AppState>,
extract::Path((id, idx, _)): extract::Path<(String, String, String)>,
) -> Result<impl IntoResponse, AppError> {
let mid = if id.starts_with("id:") {
id.to_string()
} else {
format!("id:{}", id)
};
info!("view attachment {mid} {idx}");
let idx: Vec<_> = idx
.split('.')
.map(|s| s.parse().expect("not a usize"))
.collect();
let attachment = attachment_bytes(&nm, &mid, &idx)?;
Ok(inline_attachment_response(attachment))
}
async fn download_attachment(
State(AppState { nm, .. }): State<AppState>,
extract::Path((id, idx, _)): extract::Path<(String, String, String)>,
) -> Result<impl IntoResponse, AppError> {
let mid = if id.starts_with("id:") {
id.to_string()
} else {
format!("id:{}", id)
};
info!("download attachment {mid} {idx}");
let idx: Vec<_> = idx
.split('.')
.map(|s| s.parse().expect("not a usize"))
.collect();
let attachment = attachment_bytes(&nm, &mid, &idx)?;
Ok(download_attachment_response(attachment))
}
async fn view_cid(
State(AppState { nm, .. }): State<AppState>,
extract::Path((id, cid)): extract::Path<(String, String)>,
) -> Result<impl IntoResponse, AppError> {
let mid = if id.starts_with("id:") {
id.to_string()
} else {
format!("id:{}", id)
};
info!("view cid attachment {mid} {cid}");
let attachment = cid_attachment_bytes(&nm, &mid, &cid)?;
Ok(inline_attachment_response(attachment))
}
// TODO make this work with gitea message ids like `wathiede/letterbox/pulls/91@git.z.xinu.tv`
async fn view_original(
State(AppState { nm, .. }): State<AppState>,
extract::Path(id): extract::Path<String>,
) -> Result<impl IntoResponse, AppError> {
info!("view_original {id}");
let mid = if id.starts_with("id:") {
id.to_string()
} else {
format!("id:{}", id)
};
let files = nm.files(&mid)?;
let Some(path) = files.first() else {
warn!("failed to find files for message {mid}");
return Ok((StatusCode::NOT_FOUND, mid).into_response());
};
let str = std::fs::read_to_string(&path)?;
Ok(str.into_response())
}
async fn graphiql() -> impl IntoResponse {
response::Html(
GraphiQLSource::build()
.endpoint("/api/graphql/")
.subscription_endpoint("/api/graphql/ws")
.finish(),
)
}
async fn start_ws(
ws: WebSocketUpgrade,
ConnectInfo(addr): ConnectInfo<SocketAddr>,
State(AppState {
connection_tracker, ..
}): State<AppState>,
) -> impl IntoResponse {
info!("intiating websocket connection for {addr}");
ws.on_upgrade(async move |socket| connection_tracker.lock().await.add_peer(socket, addr).await)
}
#[derive(Debug, Deserialize)]
struct NotificationParams {
delay_ms: Option<u64>,
}
async fn send_refresh_websocket_handler(
State(AppState {
connection_tracker, ..
}): State<AppState>,
params: Query<NotificationParams>,
) -> impl IntoResponse {
info!("send_refresh_websocket_handler params {params:?}");
if let Some(delay_ms) = params.delay_ms {
let delay = Duration::from_millis(delay_ms);
info!("sleeping {delay:?}");
tokio::time::sleep(delay).await;
}
connection_tracker
.lock()
.await
.send_message_all(WebsocketMessage::RefreshMessages)
.await;
"refresh triggered"
}
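// Poll for the set of unread thread IDs across notmuch and the newsreader
// database, broadcasting RefreshMessages to every websocket peer whenever the
// set changes between polls.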
async fn watch_new(
nm: Notmuch,
pool: PgPool,
conn_tracker: Arc<Mutex<ConnectionTracker>>,
poll_time: Duration,
) -> Result<(), async_graphql::Error> {
let mut old_ids = Vec::new();
loop {
let ids = compute_catchup_ids(&nm, &pool, "is:unread").await?;
if old_ids != ids {
info!("old_ids: {old_ids:?}\n ids: {ids:?}");
conn_tracker
.lock()
.await
.send_message_all(WebsocketMessage::RefreshMessages)
.await
}
old_ids = ids;
tokio::time::sleep(poll_time).await;
}
}
#[derive(Clone)]
struct AppState {
nm: Notmuch,
connection_tracker: Arc<Mutex<ConnectionTracker>>,
}
#[derive(Parser)]
#[command(version, about, long_about = None)]
struct Cli {
#[arg(short, long, default_value = "0.0.0.0:9345")]
addr: SocketAddr,
newsreader_database_url: String,
newsreader_tantivy_db_path: String,
slurp_cache_path: String,
}
#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
let cli = Cli::parse();
let _guard = xtracing::init(env!("CARGO_BIN_NAME"))?;
build_info::build_info!(fn bi);
info!("Build Info: {}", letterbox_shared::build_version(bi));
if !std::fs::exists(&cli.slurp_cache_path)? {
info!("Creating slurp cache @ '{}'", &cli.slurp_cache_path);
std::fs::create_dir_all(&cli.slurp_cache_path)?;
}
let pool = PgPool::connect(&cli.newsreader_database_url).await?;
let nm = Notmuch::default();
sqlx::migrate!("./migrations").run(&pool).await?;
#[cfg(feature = "tantivy")]
let tantivy_conn = TantivyConnection::new(&cli.newsreader_tantivy_db_path)?;
let cacher = FilesystemCacher::new(&cli.slurp_cache_path)?;
let schema = Schema::build(QueryRoot, MutationRoot, SubscriptionRoot)
.data(nm.clone())
.data(cacher)
.data(pool.clone());
// Register the tantivy connection; otherwise tantivy_conn is unused and the
// resolvers' data_unchecked::<TantivyConnection>() calls panic at runtime.
#[cfg(feature = "tantivy")]
let schema = schema.data(tantivy_conn);
let schema = schema.extension(extensions::Logger).finish();
let connection_tracker = Arc::new(Mutex::new(ConnectionTracker::default()));
let ct = Arc::clone(&connection_tracker);
let poll_time = Duration::from_secs(60);
let _h = tokio::spawn(watch_new(nm.clone(), pool, ct, poll_time));
let api_routes = Router::new()
.route(
"/download/attachment/{id}/{idx}/{*rest}",
get(download_attachment),
)
.route("/view/attachment/{id}/{idx}/{*rest}", get(view_attachment))
.route("/original/{id}", get(view_original))
.route("/cid/{id}/{cid}", get(view_cid))
.route("/ws", any(start_ws))
.route_service("/graphql/ws", GraphQLSubscription::new(schema.clone()))
.route(
"/graphql/",
get(graphiql).post_service(GraphQL::new(schema.clone())),
);
let notification_routes = Router::new()
.route("/mail", post(send_refresh_websocket_handler))
.route("/news", post(send_refresh_websocket_handler));
let app = Router::new()
.nest("/api", api_routes)
.nest("/notification", notification_routes)
.with_state(AppState {
nm,
connection_tracker,
})
.layer(
TraceLayer::new_for_http()
.make_span_with(DefaultMakeSpan::default().include_headers(true)),
);
let listener = TcpListener::bind(cli.addr).await.unwrap();
tracing::info!("listening on {}", listener.local_addr().unwrap());
axum::serve(
listener,
app.into_make_service_with_connect_info::<SocketAddr>(),
)
.await
.unwrap();
Ok(())
}
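For reference, the notification routes above mean that, for example, POST /notification/mail?delay_ms=500 sleeps half a second and then broadcasts WebsocketMessage::RefreshMessages to every connected websocket client; delay_ms is optional, and the /mail and /news paths share the same handler.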

File diff suppressed because it is too large

7
server/src/config.rs Normal file
View File

@@ -0,0 +1,7 @@
use serde::Deserialize;
#[derive(Deserialize)]
pub struct Config {
pub newsreader_database_url: String,
pub newsreader_tantivy_db_path: String,
pub slurp_cache_path: String,
}

42
server/src/error.rs Normal file
View File

@@ -0,0 +1,42 @@
use std::{convert::Infallible, str::Utf8Error, string::FromUtf8Error};
use mailparse::MailParseError;
#[cfg(feature = "tantivy")]
use tantivy::{query::QueryParserError, TantivyError};
use thiserror::Error;
use crate::TransformError;
#[derive(Error, Debug)]
pub enum ServerError {
#[error("notmuch: {0}")]
NotmuchError(#[from] letterbox_notmuch::NotmuchError),
#[error("flatten")]
FlattenError,
#[error("mail parse error: {0}")]
MailParseError(#[from] MailParseError),
#[error("IO error: {0}")]
IoError(#[from] std::io::Error),
#[error("attachement not found")]
PartNotFound,
#[error("sqlx error: {0}")]
SQLXError(#[from] sqlx::Error),
#[error("html transform error: {0}")]
TransformError(#[from] TransformError),
#[error("UTF8 error: {0}")]
Utf8Error(#[from] Utf8Error),
#[error("FromUTF8 error: {0}")]
FromUtf8Error(#[from] FromUtf8Error),
#[error("error: {0}")]
StringError(String),
#[error("invalid url: {0}")]
UrlParseError(#[from] url::ParseError),
#[cfg(feature = "tantivy")]
#[error("tantivy error: {0}")]
TantivyError(#[from] TantivyError),
#[cfg(feature = "tantivy")]
#[error("tantivy query parse error: {0}")]
QueryParseError(#[from] QueryParserError),
#[error("impossible: {0}")]
InfallibleError(#[from] Infallible),
}

689
server/src/graphql.rs Normal file
View File

@@ -0,0 +1,689 @@
use std::{fmt, str::FromStr};
use async_graphql::{
connection::{self, Connection, Edge, OpaqueCursor},
futures_util::Stream,
Context, Enum, Error, FieldResult, InputObject, Object, Schema, SimpleObject, Subscription,
Union,
};
use cacher::FilesystemCacher;
use futures::stream;
use letterbox_notmuch::Notmuch;
use log::info;
use serde::{Deserialize, Serialize};
use sqlx::postgres::PgPool;
use tokio::join;
use tracing::instrument;
#[cfg(feature = "tantivy")]
use crate::tantivy::TantivyConnection;
use crate::{newsreader, nm, Query};
/// # Number of seconds since the Epoch
pub type UnixTime = isize;
/// # Thread ID, sans "thread:"
pub type ThreadId = String;
#[derive(Debug, Enum, Copy, Clone, Eq, PartialEq)]
pub enum Corpus {
Notmuch,
Newsreader,
Tantivy,
}
impl FromStr for Corpus {
type Err = String;
fn from_str(s: &str) -> Result<Self, Self::Err> {
Ok(match s {
"notmuch" => Corpus::Notmuch,
"newsreader" => Corpus::Newsreader,
"tantivy" => Corpus::Tantivy,
s => return Err(format!("unknown corpus: '{s}'")),
})
}
}
// TODO: add is_read field and remove all use of 'tag:unread'
#[derive(Debug, SimpleObject)]
pub struct ThreadSummary {
pub thread: ThreadId,
pub timestamp: UnixTime,
/// user-friendly timestamp
pub date_relative: String,
/// number of matched messages
pub matched: isize,
/// total messages in thread
pub total: isize,
/// comma-separated names with | between matched and unmatched
pub authors: String,
pub subject: String,
pub tags: Vec<String>,
pub corpus: Corpus,
}
#[derive(Debug, Union)]
pub enum Thread {
Email(EmailThread),
News(NewsPost),
}
#[derive(Debug, SimpleObject)]
pub struct NewsPost {
pub thread_id: String,
pub is_read: bool,
pub slug: String,
pub site: String,
pub title: String,
pub body: String,
pub url: String,
pub timestamp: i64,
}
#[derive(Debug, SimpleObject)]
pub struct EmailThread {
pub thread_id: String,
pub subject: String,
pub messages: Vec<Message>,
}
#[derive(Debug, SimpleObject)]
pub struct Message {
// Message-ID for message, prepend `id:<id>` to search in notmuch
pub id: String,
// First From header found in email
pub from: Option<Email>,
// All To headers found in email
pub to: Vec<Email>,
// All CC headers found in email
pub cc: Vec<Email>,
// X-Original-To header found in email
pub x_original_to: Option<Email>,
// Delivered-To header found in email
pub delivered_to: Option<Email>,
// First Subject header found in email
pub subject: Option<String>,
// Parsed Date header, if found and valid
pub timestamp: Option<i64>,
// Headers
pub headers: Vec<Header>,
// The body contents
pub body: Body,
// On disk location of message
pub path: String,
pub attachments: Vec<Attachment>,
pub tags: Vec<String>,
}
// Content-Type: image/jpeg; name="PXL_20231125_204826860.jpg"
// Content-Disposition: attachment; filename="PXL_20231125_204826860.jpg"
// Content-Transfer-Encoding: base64
// Content-ID: <f_lponoluo1>
// X-Attachment-Id: f_lponoluo1
#[derive(Default, Debug, SimpleObject)]
pub struct Attachment {
pub id: String,
pub idx: String,
pub filename: Option<String>,
pub size: usize,
pub content_type: Option<String>,
pub content_id: Option<String>,
pub disposition: DispositionType,
pub bytes: Vec<u8>,
}
#[derive(Debug, Clone, Eq, PartialEq)]
pub struct Disposition {
pub r#type: DispositionType,
pub filename: Option<String>,
pub size: Option<usize>,
}
#[derive(Debug, Enum, Copy, Clone, Eq, PartialEq)]
pub enum DispositionType {
Inline,
Attachment,
}
impl From<mailparse::DispositionType> for DispositionType {
fn from(value: mailparse::DispositionType) -> Self {
match value {
mailparse::DispositionType::Inline => DispositionType::Inline,
mailparse::DispositionType::Attachment => DispositionType::Attachment,
dt => panic!("unhandled DispositionType {dt:?}"),
}
}
}
impl Default for DispositionType {
fn default() -> Self {
DispositionType::Attachment
}
}
#[derive(Debug, SimpleObject)]
pub struct Header {
pub key: String,
pub value: String,
}
#[derive(Debug)]
pub struct UnhandledContentType {
pub text: String,
pub content_tree: String,
}
#[Object]
impl UnhandledContentType {
async fn contents(&self) -> &str {
&self.text
}
async fn content_tree(&self) -> &str {
&self.content_tree
}
}
#[derive(Debug)]
pub struct PlainText {
pub text: String,
pub content_tree: String,
}
#[Object]
impl PlainText {
async fn contents(&self) -> &str {
&self.text
}
async fn content_tree(&self) -> &str {
&self.content_tree
}
}
#[derive(Debug)]
pub struct Html {
pub html: String,
pub content_tree: String,
}
#[Object]
impl Html {
async fn contents(&self) -> &str {
&self.html
}
async fn content_tree(&self) -> &str {
&self.content_tree
}
async fn headers(&self) -> Vec<Header> {
Vec::new()
}
}
#[derive(Debug, Union)]
pub enum Body {
UnhandledContentType(UnhandledContentType),
PlainText(PlainText),
Html(Html),
}
impl Body {
pub fn html(html: String) -> Body {
Body::Html(Html {
html,
content_tree: "".to_string(),
})
}
pub fn text(text: String) -> Body {
Body::PlainText(PlainText {
text,
content_tree: "".to_string(),
})
}
}
#[derive(Debug, SimpleObject)]
pub struct Email {
pub name: Option<String>,
pub addr: Option<String>,
pub photo_url: Option<String>,
}
impl fmt::Display for Email {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> Result<(), std::fmt::Error> {
match (&self.name, &self.addr) {
(Some(name), Some(addr)) => write!(f, "{name} <{addr}>")?,
(Some(name), None) => write!(f, "{name}")?,
(None, Some(addr)) => write!(f, "{addr}")?,
(None, None) => write!(f, "<UNKNOWN>")?,
}
Ok(())
}
}
#[derive(SimpleObject)]
pub struct Tag {
pub name: String,
pub fg_color: String,
pub bg_color: String,
pub unread: usize,
}
#[derive(Serialize, Deserialize, Debug, InputObject)]
struct SearchCursor {
newsreader_offset: i32,
notmuch_offset: i32,
#[cfg(feature = "tantivy")]
tantivy_offset: i32,
}
fn request_id() -> String {
let now = std::time::SystemTime::now();
let nanos = now
.duration_since(std::time::SystemTime::UNIX_EPOCH)
.unwrap_or_default()
.as_nanos();
format!("{nanos:x}")
}
pub struct QueryRoot;
#[Object]
impl QueryRoot {
async fn version<'ctx>(&self, _ctx: &Context<'ctx>) -> Result<String, Error> {
build_info::build_info!(fn bi);
Ok(letterbox_shared::build_version(bi))
}
#[instrument(skip_all, fields(query=query, rid=request_id()))]
async fn count<'ctx>(&self, ctx: &Context<'ctx>, query: String) -> Result<usize, Error> {
let nm = ctx.data_unchecked::<Notmuch>();
let pool = ctx.data_unchecked::<PgPool>();
#[cfg(feature = "tantivy")]
let tantivy = ctx.data_unchecked::<TantivyConnection>();
let newsreader_query: Query = query.parse()?;
let newsreader_count = newsreader::count(pool, &newsreader_query).await?;
let notmuch_count = nm::count(nm, &newsreader_query).await?;
#[cfg(feature = "tantivy")]
let tantivy_count = tantivy.count(&newsreader_query).await?;
#[cfg(not(feature = "tantivy"))]
let tantivy_count = 0;
let total = newsreader_count + notmuch_count + tantivy_count;
info!("count {newsreader_query:?} newsreader count {newsreader_count} notmuch count {notmuch_count} tantivy count {tantivy_count} total {total}");
Ok(total)
}
#[instrument(skip_all, fields(query=query, rid=request_id()))]
async fn catchup<'ctx>(
&self,
ctx: &Context<'ctx>,
query: String,
) -> Result<Vec<String>, Error> {
let nm = ctx.data_unchecked::<Notmuch>();
let pool = ctx.data_unchecked::<PgPool>();
compute_catchup_ids(nm, pool, &query).await
}
// TODO: this function doesn't get parallelism, possibly because notmuch is sync and blocks;
// rewrite that with tokio::process::Command
#[instrument(skip_all, fields(query=query, rid=request_id()))]
async fn search<'ctx>(
&self,
ctx: &Context<'ctx>,
after: Option<String>,
before: Option<String>,
first: Option<i32>,
last: Option<i32>,
query: String,
) -> Result<Connection<OpaqueCursor<SearchCursor>, ThreadSummary>, Error> {
info!("search({after:?} {before:?} {first:?} {last:?} {query:?})",);
let nm = ctx.data_unchecked::<Notmuch>();
let pool = ctx.data_unchecked::<PgPool>();
#[cfg(feature = "tantivy")]
let tantivy = ctx.data_unchecked::<TantivyConnection>();
Ok(connection::query(
after,
before,
first,
last,
|after: Option<OpaqueCursor<SearchCursor>>,
before: Option<OpaqueCursor<SearchCursor>>,
first: Option<usize>,
last: Option<usize>| async move {
info!(
"search(after {:?} before {:?} first {first:?} last {last:?} query: {query:?})",
after.as_ref().map(|v| &v.0),
before.as_ref().map(|v| &v.0)
);
let newsreader_after = after.as_ref().map(|sc| sc.newsreader_offset);
let notmuch_after = after.as_ref().map(|sc| sc.notmuch_offset);
#[cfg(feature = "tantivy")]
let tantivy_after = after.as_ref().map(|sc| sc.tantivy_offset);
let newsreader_before = before.as_ref().map(|sc| sc.newsreader_offset);
let notmuch_before = before.as_ref().map(|sc| sc.notmuch_offset);
#[cfg(feature = "tantivy")]
let tantivy_before = before.as_ref().map(|sc| sc.tantivy_offset);
let first = first.map(|v| v as i32);
let last = last.map(|v| v as i32);
let query: Query = query.parse()?;
info!("newsreader_query {query:?}");
let newsreader_fut = newsreader_search(
pool,
newsreader_after,
newsreader_before,
first,
last,
&query,
);
let notmuch_fut =
notmuch_search(nm, notmuch_after, notmuch_before, first, last, &query);
#[cfg(feature = "tantivy")]
let tantivy_fut = tantivy_search(
tantivy,
pool,
tantivy_after,
tantivy_before,
first,
last,
&query,
);
#[cfg(not(feature = "tantivy"))]
let tantivy_fut =
async { Ok::<Vec<ThreadSummaryCursor>, async_graphql::Error>(Vec::new()) };
let (newsreader_results, notmuch_results, tantivy_results) =
join!(newsreader_fut, notmuch_fut, tantivy_fut);
let newsreader_results = newsreader_results?;
let notmuch_results = notmuch_results?;
let tantivy_results = tantivy_results?;
info!(
"newsreader_results ({}) notmuch_results ({}) tantivy_results ({})",
newsreader_results.len(),
notmuch_results.len(),
tantivy_results.len()
);
let mut results: Vec<_> = newsreader_results
.into_iter()
.chain(notmuch_results)
.chain(tantivy_results)
.collect();
// The leading '-' is to reverse sort
results.sort_by_key(|item| match item {
ThreadSummaryCursor::Newsreader(_, ts) => -ts.timestamp,
ThreadSummaryCursor::Notmuch(_, ts) => -ts.timestamp,
#[cfg(feature = "tantivy")]
ThreadSummaryCursor::Tantivy(_, ts) => -ts.timestamp,
});
let mut has_next_page = before.is_some();
if let Some(first) = first {
let first = first as usize;
if results.len() > first {
has_next_page = true;
results.truncate(first);
}
}
let mut has_previous_page = after.is_some();
if let Some(last) = last {
let last = last as usize;
if results.len() > last {
has_previous_page = true;
results.truncate(last);
}
}
let mut connection = Connection::new(has_previous_page, has_next_page);
// Set starting offset as the value from cursor to preserve state if no results from a corpus survived the truncation
let mut newsreader_offset =
after.as_ref().map(|sc| sc.newsreader_offset).unwrap_or(0);
let mut notmuch_offset = after.as_ref().map(|sc| sc.notmuch_offset).unwrap_or(0);
#[cfg(feature = "tantivy")]
let mut tantivy_offset = after.as_ref().map(|sc| sc.tantivy_offset).unwrap_or(0);
info!(
"newsreader_offset ({}) notmuch_offset ({})",
newsreader_offset, notmuch_offset,
);
connection.edges.extend(results.into_iter().map(|item| {
let thread_summary;
match item {
ThreadSummaryCursor::Newsreader(offset, ts) => {
thread_summary = ts;
newsreader_offset = offset;
}
ThreadSummaryCursor::Notmuch(offset, ts) => {
thread_summary = ts;
notmuch_offset = offset;
}
#[cfg(feature = "tantivy")]
ThreadSummaryCursor::Tantivy(offset, ts) => {
thread_summary = ts;
tantivy_offset = offset;
}
}
let cur = OpaqueCursor(SearchCursor {
newsreader_offset,
notmuch_offset,
#[cfg(feature = "tantivy")]
tantivy_offset,
});
Edge::new(cur, thread_summary)
}));
Ok::<_, async_graphql::Error>(connection)
},
)
.await?)
}
#[instrument(skip_all, fields(rid=request_id()))]
async fn tags<'ctx>(&self, ctx: &Context<'ctx>) -> FieldResult<Vec<Tag>> {
let nm = ctx.data_unchecked::<Notmuch>();
let pool = ctx.data_unchecked::<PgPool>();
let needs_unread = ctx.look_ahead().field("unread").exists();
let mut tags = newsreader::tags(pool, needs_unread).await?;
tags.append(&mut nm::tags(nm, needs_unread)?);
Ok(tags)
}
#[instrument(skip_all, fields(thread_id=thread_id, rid=request_id()))]
async fn thread<'ctx>(&self, ctx: &Context<'ctx>, thread_id: String) -> Result<Thread, Error> {
let nm = ctx.data_unchecked::<Notmuch>();
let cacher = ctx.data_unchecked::<FilesystemCacher>();
let pool = ctx.data_unchecked::<PgPool>();
let debug_content_tree = ctx
.look_ahead()
.field("messages")
.field("body")
.field("contentTree")
.exists();
if newsreader::is_newsreader_thread(&thread_id) {
Ok(newsreader::thread(cacher, pool, thread_id).await?)
} else {
Ok(nm::thread(nm, pool, thread_id, debug_content_tree).await?)
}
}
}
#[derive(Debug)]
enum ThreadSummaryCursor {
Newsreader(i32, ThreadSummary),
Notmuch(i32, ThreadSummary),
#[cfg(feature = "tantivy")]
Tantivy(i32, ThreadSummary),
}
async fn newsreader_search(
pool: &PgPool,
after: Option<i32>,
before: Option<i32>,
first: Option<i32>,
last: Option<i32>,
query: &Query,
) -> Result<Vec<ThreadSummaryCursor>, async_graphql::Error> {
Ok(newsreader::search(pool, after, before, first, last, query)
.await?
.into_iter()
.map(|(cur, ts)| ThreadSummaryCursor::Newsreader(cur, ts))
.collect())
}
async fn notmuch_search(
nm: &Notmuch,
after: Option<i32>,
before: Option<i32>,
first: Option<i32>,
last: Option<i32>,
query: &Query,
) -> Result<Vec<ThreadSummaryCursor>, async_graphql::Error> {
Ok(nm::search(nm, after, before, first, last, query)
.await?
.into_iter()
.map(|(cur, ts)| ThreadSummaryCursor::Notmuch(cur, ts))
.collect())
}
#[cfg(feature = "tantivy")]
async fn tantivy_search(
tantivy: &TantivyConnection,
pool: &PgPool,
after: Option<i32>,
before: Option<i32>,
first: Option<i32>,
last: Option<i32>,
query: &Query,
) -> Result<Vec<ThreadSummaryCursor>, async_graphql::Error> {
Ok(tantivy
.search(pool, after, before, first, last, query)
.await?
.into_iter()
.map(|(cur, ts)| ThreadSummaryCursor::Tantivy(cur, ts))
.collect())
}
pub struct MutationRoot;
#[Object]
impl MutationRoot {
#[instrument(skip_all, fields(query=query, unread=unread, rid=request_id()))]
async fn set_read_status<'ctx>(
&self,
ctx: &Context<'ctx>,
query: String,
unread: bool,
) -> Result<bool, Error> {
let nm = ctx.data_unchecked::<Notmuch>();
let pool = ctx.data_unchecked::<PgPool>();
#[cfg(feature = "tantivy")]
let tantivy = ctx.data_unchecked::<TantivyConnection>();
let query: Query = query.parse()?;
newsreader::set_read_status(pool, &query, unread).await?;
#[cfg(feature = "tantivy")]
tantivy.reindex_thread(pool, &query).await?;
nm::set_read_status(nm, &query, unread).await?;
Ok(true)
}
#[instrument(skip_all, fields(query=query, tag=tag, rid=request_id()))]
async fn tag_add<'ctx>(
&self,
ctx: &Context<'ctx>,
query: String,
tag: String,
) -> Result<bool, Error> {
let nm = ctx.data_unchecked::<Notmuch>();
info!("tag_add({tag}, {query})");
nm.tag_add(&tag, &query)?;
Ok(true)
}
#[instrument(skip_all, fields(query=query, tag=tag, rid=request_id()))]
async fn tag_remove<'ctx>(
&self,
ctx: &Context<'ctx>,
query: String,
tag: String,
) -> Result<bool, Error> {
let nm = ctx.data_unchecked::<Notmuch>();
info!("tag_remove({tag}, {query})");
nm.tag_remove(&tag, &query)?;
Ok(true)
}
/// Drop and recreate tantivy index. Warning this is slow
#[cfg(feature = "tantivy")]
async fn drop_and_load_index<'ctx>(&self, ctx: &Context<'ctx>) -> Result<bool, Error> {
let tantivy = ctx.data_unchecked::<TantivyConnection>();
let pool = ctx.data_unchecked::<PgPool>();
tantivy.drop_and_load_index()?;
tantivy.reindex_all(pool).await?;
Ok(true)
}
#[instrument(skip_all, fields(rid=request_id()))]
async fn refresh<'ctx>(&self, ctx: &Context<'ctx>) -> Result<bool, Error> {
let nm = ctx.data_unchecked::<Notmuch>();
let cacher = ctx.data_unchecked::<FilesystemCacher>();
let pool = ctx.data_unchecked::<PgPool>();
info!("{}", String::from_utf8_lossy(&nm.new()?));
newsreader::refresh(pool, cacher).await?;
#[cfg(feature = "tantivy")]
{
let tantivy = ctx.data_unchecked::<TantivyConnection>();
// TODO: parallelize
tantivy.refresh(pool).await?;
}
Ok(true)
}
}
pub struct SubscriptionRoot;
#[Subscription]
impl SubscriptionRoot {
async fn values(&self, _ctx: &Context<'_>) -> Result<impl Stream<Item = usize>, Error> {
Ok(stream::iter(0..10))
}
}
pub type GraphqlSchema = Schema<QueryRoot, MutationRoot, SubscriptionRoot>;
#[instrument(skip_all, fields(query=query))]
pub async fn compute_catchup_ids(
nm: &Notmuch,
pool: &PgPool,
query: &str,
) -> Result<Vec<String>, Error> {
let query: Query = query.parse()?;
// TODO: implement optimized versions of fetching just IDs
let newsreader_fut = newsreader_search(pool, None, None, None, None, &query);
let notmuch_fut = notmuch_search(nm, None, None, None, None, &query);
let (newsreader_results, notmuch_results) = join!(newsreader_fut, notmuch_fut);
let newsreader_results = newsreader_results?;
let notmuch_results = notmuch_results?;
info!(
"newsreader_results ({}) notmuch_results ({})",
newsreader_results.len(),
notmuch_results.len(),
);
let mut results: Vec<_> = newsreader_results
.into_iter()
.chain(notmuch_results)
.collect();
// The leading '-' is to reverse sort
results.sort_by_key(|item| match item {
ThreadSummaryCursor::Newsreader(_, ts) => -ts.timestamp,
ThreadSummaryCursor::Notmuch(_, ts) => -ts.timestamp,
});
let ids = results
.into_iter()
.map(|r| match r {
ThreadSummaryCursor::Newsreader(_, ts) => ts.thread,
ThreadSummaryCursor::Notmuch(_, ts) => ts.thread,
})
.collect();
Ok(ids)
}

962
server/src/lib.rs Normal file
View File

@@ -0,0 +1,962 @@
pub mod config;
pub mod error;
pub mod graphql;
pub mod newsreader;
pub mod nm;
pub mod ws;
#[cfg(feature = "tantivy")]
pub mod tantivy;
use std::{
collections::{HashMap, HashSet},
convert::Infallible,
fmt,
str::FromStr,
sync::Arc,
};
use async_trait::async_trait;
use cacher::{Cacher, FilesystemCacher};
use css_inline::{CSSInliner, InlineError, InlineOptions};
pub use error::ServerError;
use linkify::{LinkFinder, LinkKind};
use log::{debug, error, info, warn};
use lol_html::{
element, errors::RewritingError, html_content::ContentType, rewrite_str, text,
RewriteStrSettings,
};
use maplit::{hashmap, hashset};
use regex::Regex;
use reqwest::StatusCode;
use scraper::{Html, Selector};
use sqlx::types::time::PrimitiveDateTime;
use thiserror::Error;
use url::Url;
use crate::{
graphql::{Corpus, ThreadSummary},
newsreader::is_newsreader_thread,
nm::is_notmuch_thread_or_id,
};
const NEWSREADER_TAG_PREFIX: &'static str = "News/";
const NEWSREADER_THREAD_PREFIX: &'static str = "news:";
// TODO: figure out how to use Cow
#[async_trait]
trait Transformer: Send + Sync {
fn should_run(&self, _addr: &Option<Url>, _html: &str) -> bool {
true
}
// TODO: should html be something like `html_escape` uses:
// <S: ?Sized + AsRef<str>>(text: &S) -> Cow<str>
async fn transform(&self, addr: &Option<Url>, html: &str) -> Result<String, TransformError>;
}
// TODO: how would we make this more generic to allow good implementations of Transformer outside
// of this module?
#[derive(Error, Debug)]
pub enum TransformError {
#[error("lol-html rewrite error: {0}")]
RewritingError(#[from] RewritingError),
#[error("css inline error: {0}")]
InlineError(#[from] InlineError),
#[error("failed to fetch url error: {0}")]
ReqwestError(#[from] reqwest::Error),
#[error("failed to parse HTML: {0}")]
HtmlParsingError(String),
#[error("got a retryable error code {0} for {1}")]
RetryableHttpStatusError(StatusCode, String),
}
struct SanitizeHtml<'a> {
cid_prefix: &'a str,
base_url: &'a Option<Url>,
}
#[async_trait]
impl<'a> Transformer for SanitizeHtml<'a> {
async fn transform(&self, _: &Option<Url>, html: &str) -> Result<String, TransformError> {
Ok(sanitize_html(html, self.cid_prefix, self.base_url)?)
}
}
struct EscapeHtml;
#[async_trait]
impl Transformer for EscapeHtml {
fn should_run(&self, _: &Option<Url>, html: &str) -> bool {
html.contains("&")
}
async fn transform(&self, _: &Option<Url>, html: &str) -> Result<String, TransformError> {
Ok(html_escape::decode_html_entities(html).to_string())
}
}
struct StripHtml;
#[async_trait]
impl Transformer for StripHtml {
fn should_run(&self, link: &Option<Url>, html: &str) -> bool {
debug!("StripHtml should_run {link:?} {}", html.contains("<"));
// Lame test
html.contains("<")
}
async fn transform(&self, link: &Option<Url>, html: &str) -> Result<String, TransformError> {
debug!("StripHtml {link:?}");
let mut text = String::new();
let element_content_handlers = vec![
element!("style", |el| {
el.remove();
Ok(())
}),
element!("script", |el| {
el.remove();
Ok(())
}),
];
let html = rewrite_str(
html,
RewriteStrSettings {
element_content_handlers,
..RewriteStrSettings::default()
},
)?;
let element_content_handlers = vec![text!("*", |t| {
text += t.as_str();
Ok(())
})];
let _ = rewrite_str(
&html,
RewriteStrSettings {
element_content_handlers,
..RewriteStrSettings::default()
},
)?;
let re = Regex::new(r"\s+").expect("failed to parse regex");
let text = re.replace_all(&text, " ").to_string();
Ok(text)
}
}
struct InlineStyle;
#[async_trait]
impl Transformer for InlineStyle {
async fn transform(&self, _: &Option<Url>, html: &str) -> Result<String, TransformError> {
let css = concat!(
"/* chrome-default.css */\n",
include_str!("chrome-default.css"),
//"\n/* mvp.css */\n",
//include_str!("mvp.css"),
//"\n/* Xinu Specific overrides */\n",
//include_str!("custom.css"),
);
let inline_opts = InlineOptions {
inline_style_tags: true,
keep_style_tags: false,
keep_link_tags: true,
base_url: None,
load_remote_stylesheets: true,
extra_css: Some(css.into()),
preallocate_node_capacity: 32,
..InlineOptions::default()
};
//info!("HTML:\n{html}");
Ok(match CSSInliner::new(inline_opts).inline(&html) {
Ok(inlined_html) => inlined_html,
Err(err) => {
error!("failed to inline CSS: {err}");
html.to_string()
}
})
}
}
/// FrameImages extracts any alt or title attributes on images and places them as
/// captions below the image. It also rewrites data-src and data-cfsrc attributes to src.
struct FrameImages;
#[async_trait]
impl Transformer for FrameImages {
async fn transform(&self, _: &Option<Url>, html: &str) -> Result<String, TransformError> {
Ok(rewrite_str(
html,
RewriteStrSettings {
element_content_handlers: vec![
element!("img[data-src]", |el| {
let src = el
.get_attribute("data-src")
.unwrap_or("https://placehold.co/600x400".to_string());
el.set_attribute("src", &src)?;
Ok(())
}),
element!("img[data-cfsrc]", |el| {
let src = el
.get_attribute("data-cfsrc")
.unwrap_or("https://placehold.co/600x400".to_string());
el.set_attribute("src", &src)?;
Ok(())
}),
element!("img[alt], img[title]", |el| {
let src = el
.get_attribute("src")
.unwrap_or("https://placehold.co/600x400".to_string());
let alt = el.get_attribute("alt");
let title = el.get_attribute("title");
let mut frags =
vec!["<figure>".to_string(), format!(r#"<img src="{src}">"#)];
alt.map(|t| {
if !t.is_empty() {
frags.push(format!("<figcaption>Alt: {t}</figcaption>"))
}
});
title.map(|t| {
if !t.is_empty() {
frags.push(format!("<figcaption>Title: {t}</figcaption>"))
}
});
frags.push("</figure>".to_string());
el.replace(&frags.join("\n"), ContentType::Html);
Ok(())
}),
],
..RewriteStrSettings::default()
},
)?)
}
}
struct AddOutlink;
#[async_trait]
impl Transformer for AddOutlink {
fn should_run(&self, link: &Option<Url>, html: &str) -> bool {
if let Some(link) = link {
link.scheme().starts_with("http") && !html.contains(link.as_str())
} else {
false
}
}
async fn transform(&self, link: &Option<Url>, html: &str) -> Result<String, TransformError> {
if let Some(link) = link {
Ok(format!(
r#"
{html}
<div><a href="{}">View on site</a></div>
"#,
link
))
} else {
Ok(html.to_string())
}
}
}
struct SlurpContents<'c> {
cacher: &'c FilesystemCacher,
inline_css: bool,
site_selectors: HashMap<String, Vec<Selector>>,
}
impl<'c> SlurpContents<'c> {
fn get_selectors(&self, link: &Url) -> Option<&[Selector]> {
for (host, selector) in self.site_selectors.iter() {
if link.host_str().map(|h| h.contains(host)).unwrap_or(false) {
return Some(&selector);
}
}
None
}
}
#[async_trait]
impl<'c> Transformer for SlurpContents<'c> {
fn should_run(&self, link: &Option<Url>, html: &str) -> bool {
debug!("SlurpContents should_run {link:?}");
let mut will_slurp = false;
if let Some(link) = link {
will_slurp = self.get_selectors(link).is_some();
}
if !will_slurp && self.inline_css {
return InlineStyle {}.should_run(link, html);
}
will_slurp
}
async fn transform(&self, link: &Option<Url>, html: &str) -> Result<String, TransformError> {
debug!("SlurpContents {link:?}");
let retryable_status: HashSet<StatusCode> = vec![
StatusCode::UNAUTHORIZED,
StatusCode::FORBIDDEN,
StatusCode::REQUEST_TIMEOUT,
StatusCode::TOO_MANY_REQUESTS,
]
.into_iter()
.collect();
if let Some(test_link) = link {
// If SlurpContents is configured for inline CSS, but no
// configuration found for this site, use the local InlineStyle
// transform.
if self.inline_css && self.get_selectors(test_link).is_none() {
debug!("local inline CSS for {link:?}");
return InlineStyle {}.transform(link, html).await;
}
}
let Some(link) = link else {
return Ok(html.to_string());
};
let Some(selectors) = self.get_selectors(&link) else {
return Ok(html.to_string());
};
let cacher = self.cacher;
let body = if let Some(body) = cacher.get(link.as_str()) {
String::from_utf8_lossy(&body).to_string()
} else {
let resp = reqwest::get(link.as_str()).await?;
let status = resp.status();
if status.is_server_error() {
error!("status error for {link}: {status}");
return Ok(html.to_string());
}
if retryable_status.contains(&status) {
error!("retryable error for {link}: {status}");
return Ok(html.to_string());
}
if !status.is_success() {
error!("unsuccessful for {link}: {status}");
return Ok(html.to_string());
}
let body = resp.text().await?;
cacher.set(link.as_str(), body.as_bytes());
body
};
let body = Arc::new(body);
let base_url = Some(link.clone());
let body = if self.inline_css {
debug!("inlining CSS for {link}");
let inner_body = Arc::clone(&body);
let res = tokio::task::spawn_blocking(move || {
let css = concat!(
"/* chrome-default.css */\n",
include_str!("chrome-default.css"),
"\n/* vars.css */\n",
include_str!("../static/vars.css"),
//"\n/* Xinu Specific overrides */\n",
//include_str!("custom.css"),
);
let res = CSSInliner::options()
.base_url(base_url)
.extra_css(Some(std::borrow::Cow::Borrowed(css)))
.build()
.inline(&inner_body);
match res {
Ok(inlined_html) => inlined_html,
Err(err) => {
error!("failed to inline remote CSS: {err}");
Arc::into_inner(inner_body).expect("failed to take body out of Arc")
}
}
})
.await;
match res {
Ok(inlined_html) => inlined_html,
Err(err) => {
error!("failed to spawn inline remote CSS: {err}");
Arc::into_inner(body).expect("failed to take body out of Arc")
}
}
} else {
debug!("using body as-is for {link:?}");
Arc::into_inner(body).expect("failed to take body out of Arc")
};
let doc = Html::parse_document(&body);
let mut results = Vec::new();
for selector in selectors {
for frag in doc.select(&selector) {
results.push(frag.html())
// TODO: figure out how to warn if there were no hits
//warn!("couldn't find '{:?}' in {}", selector, link);
}
}
Ok(results.join("<br>"))
}
}
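// Usage sketch (not from the diff): with url_must_have_scheme(false) set below,
// bare domains are matched too, so linkify_html("see example.com") yields
// `see <a href="http://example.com">example.com</a>`.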
pub fn linkify_html(text: &str) -> String {
let mut finder = LinkFinder::new();
let finder = finder.url_must_have_scheme(false).kinds(&[LinkKind::Url]);
let mut parts = Vec::new();
for span in finder.spans(text) {
// TODO(wathiede): use Cow<str>?
match span.kind() {
// Text as-is
None => parts.push(span.as_str().to_string()),
// Wrap in anchor tag
Some(LinkKind::Url) => {
let text = span.as_str();
let schema = if text.starts_with("http") {
""
} else {
"http://"
};
let a = format!(r#"<a href="{schema}{0}">{0}</a>"#, text);
parts.push(a);
}
_ => todo!("unhandled kind: {:?}", span.kind().unwrap()),
}
}
parts.join("")
}
// html contains the content to be cleaned, and cid_prefix is used to resolve mixed part image
// references
pub fn sanitize_html(
html: &str,
cid_prefix: &str,
base_url: &Option<Url>,
) -> Result<String, TransformError> {
let inline_opts = InlineOptions {
inline_style_tags: true,
keep_style_tags: true,
keep_link_tags: false,
base_url: None,
load_remote_stylesheets: false,
extra_css: None,
preallocate_node_capacity: 32,
..InlineOptions::default()
};
let html = match CSSInliner::new(inline_opts).inline(&html) {
Ok(inlined_html) => inlined_html,
Err(err) => {
error!("failed to inline CSS: {err}");
html.to_string()
}
};
let mut element_content_handlers = vec![
// Remove width and height attributes on elements
element!("[width],[height]", |el| {
el.remove_attribute("width");
el.remove_attribute("height");
Ok(())
}),
// Remove width and height values from inline styles
element!("[style]", |el| {
let style = el.get_attribute("style").unwrap();
let style = style
.split(";")
.filter(|s| {
let Some((k, _)) = s.split_once(':') else {
return true;
};
match k {
"width" | "max-width" | "min-width" | "height" | "max-height"
| "min-height" => false,
_ => true,
}
})
.collect::<Vec<_>>()
.join(";");
if let Err(e) = el.set_attribute("style", &style) {
error!("Failed to set style attribute: {e}");
}
Ok(())
}),
// Open links in new tab
element!("a[href]", |el| {
el.set_attribute("target", "_blank").unwrap();
Ok(())
}),
// Replace mixed part CID images with URL
element!("img[src]", |el| {
let src = el
.get_attribute("src")
.expect("src was required")
.replace("cid:", cid_prefix);
el.set_attribute("src", &src)?;
Ok(())
}),
// Only secure image URLs
element!("img[src]", |el| {
let src = el
.get_attribute("src")
.expect("src was required")
.replace("http:", "https:");
el.set_attribute("src", &src)?;
Ok(())
}),
// Add https to href with //<domain name>
element!("link[href]", |el| {
info!("found link[href] {el:?}");
let mut href = el.get_attribute("href").expect("href was required");
if href.starts_with("//") {
warn!("adding https to {href}");
href.insert_str(0, "https:");
}
el.set_attribute("href", &href)?;
Ok(())
}),
// Add https to src with //<domain name>
element!("style[src]", |el| {
let mut src = el.get_attribute("src").expect("src was required");
if src.starts_with("//") {
src.insert_str(0, "https:");
}
el.set_attribute("src", &src)?;
Ok(())
}),
];
if let Some(base_url) = base_url {
element_content_handlers.extend(vec![
// Make links with relative URLs absolute
element!("a[href]", |el| {
if let Some(Ok(href)) = el.get_attribute("href").map(|href| base_url.join(&href)) {
el.set_attribute("href", &href.as_str()).unwrap();
}
Ok(())
}),
// Make images with relative srcs absolute
element!("img[src]", |el| {
if let Some(Ok(src)) = el.get_attribute("src").map(|src| base_url.join(&src)) {
el.set_attribute("src", &src.as_str()).unwrap();
}
Ok(())
}),
]);
}
let html = rewrite_str(
&html,
RewriteStrSettings {
element_content_handlers,
..RewriteStrSettings::default()
},
)?;
// Defaults don't allow style, but we want to preserve it.
// TODO: remove 'class' if rendering mails moves to a two-phase process where abstract message
// types are collected, sanitized, and then grouped together as one big HTML doc
let attributes = hashset![
"align", "bgcolor", "class", "color", "height", "lang", "title", "width", "style",
];
let tags = hashset![
"a",
"abbr",
"acronym",
"area",
"article",
"aside",
"b",
"bdi",
"bdo",
"blockquote",
"br",
"caption",
"center",
"cite",
"code",
"col",
"colgroup",
"data",
"dd",
"del",
"details",
"dfn",
"div",
"dl",
"dt",
"em",
"figcaption",
"figure",
"footer",
"h1",
"h2",
"h3",
"h4",
"h5",
"h6",
"header",
"hgroup",
"hr",
"i",
"iframe", // wathiede
"img",
"ins",
"kbd",
"kbd",
"li",
"map",
"mark",
"nav",
"noscript", // wathiede
"ol",
"p",
"pre",
"q",
"rp",
"rt",
"rtc",
"ruby",
"s",
"samp",
"small",
"span",
"strike",
"strong",
"sub",
"summary",
"sup",
"table",
"tbody",
"td",
"th",
"thead",
"time",
"title", // wathiede
"tr",
"tt",
"u",
"ul",
"var",
"wbr",
];
let tag_attributes = hashmap![
"a" => hashset![
"href", "hreflang", "target",
],
"bdo" => hashset![
"dir"
],
"blockquote" => hashset![
"cite"
],
"col" => hashset![
"align", "char", "charoff", "span"
],
"colgroup" => hashset![
"align", "char", "charoff", "span"
],
"del" => hashset![
"cite", "datetime"
],
"hr" => hashset![
"align", "size", "width"
],
"iframe" => hashset![
"src", "allow", "allowfullscreen"
],
"img" => hashset![
"align", "alt", "height", "src", "width"
],
"ins" => hashset![
"cite", "datetime"
],
"ol" => hashset![
"start"
],
"q" => hashset![
"cite"
],
"table" => hashset![
"align", "border", "cellpadding", "cellspacing", "char", "charoff", "summary",
],
"tbody" => hashset![
"align", "char", "charoff"
],
"td" => hashset![
"align", "char", "charoff", "colspan", "headers", "rowspan"
],
"tfoot" => hashset![
"align", "char", "charoff"
],
"th" => hashset![
"align", "char", "charoff", "colspan", "headers", "rowspan", "scope"
],
"thead" => hashset![
"align", "char", "charoff"
],
"tr" => hashset![
"align", "char", "charoff"
],
];
let html = ammonia::Builder::default()
.tags(tags)
.tag_attributes(tag_attributes)
.generic_attributes(attributes)
.clean(&html)
.to_string();
Ok(html)
}
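// Map Relay-style connection arguments (after/before cursors as row offsets,
// first/last page sizes) onto a SQL-style (OFFSET, LIMIT) pair; argument
// combinations that make no sense for forward or backward pagination panic.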
fn compute_offset_limit(
after: Option<i32>,
before: Option<i32>,
first: Option<i32>,
last: Option<i32>,
) -> (i32, i32) {
let default_page_size = 10000;
match (after, before, first, last) {
// Reasonable defaults
(None, None, None, None) => (0, default_page_size),
(None, None, Some(first), None) => (0, first),
(Some(after), None, None, None) => (after + 1, default_page_size),
(Some(after), None, Some(first), None) => (after + 1, first),
(None, Some(before), None, None) => (0.max(before - default_page_size), default_page_size),
(None, Some(before), None, Some(last)) => (0.max(before - last), last),
(None, None, None, Some(_)) => {
panic!("specifying last and no before doesn't make sense")
}
(None, None, Some(_), Some(_)) => {
panic!("specifying first and last doesn't make sense")
}
(None, Some(_), Some(_), _) => {
panic!("specifying before and first doesn't make sense")
}
(Some(_), Some(_), _, _) => {
panic!("specifying after and before doesn't make sense")
}
(Some(_), None, None, Some(_)) => {
panic!("specifying after and last doesn't make sense")
}
(Some(_), None, Some(_), Some(_)) => {
panic!("specifying after, first and last doesn't make sense")
}
}
}
#[derive(Debug, Default)]
pub struct Query {
pub unread_only: bool,
pub tags: Vec<String>,
pub uids: Vec<String>,
pub remainder: Vec<String>,
pub is_notmuch: bool,
pub is_newsreader: bool,
pub is_tantivy: bool,
pub corpus: Option<Corpus>,
}
impl fmt::Display for Query {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> Result<(), std::fmt::Error> {
if self.unread_only {
write!(f, "is:unread ")?;
}
for tag in &self.tags {
write!(f, "tag:{tag} ")?;
}
for uid in &self.uids {
write!(f, "id:{uid} ")?;
}
if self.is_notmuch {
write!(f, "is:mail ")?;
}
if self.is_newsreader {
write!(f, "is:news ")?;
}
if let Some(c) = self.corpus {
// Lowercase and add a trailing space so the output round-trips through FromStr
write!(f, "corpus:{} ", format!("{c:?}").to_lowercase())?;
}
for rem in &self.remainder {
write!(f, "{rem} ")?;
}
Ok(())
}
}
impl Query {
// Converts the internal state of Query to something suitable for notmuch queries, removing any
// letterbox-specific '<key>:<value>' terms.
fn to_notmuch(&self) -> String {
let mut parts = Vec::new();
if !self.is_notmuch {
return String::new();
}
if self.unread_only {
parts.push("is:unread".to_string());
}
for tag in &self.tags {
parts.push(format!("tag:{tag}"));
}
for uid in &self.uids {
parts.push(uid.clone());
}
for r in &self.remainder {
// Rewrite "to:" to include ExtraTo:. ExtraTo: is configured in
// notmuch-config to index Delivered-To and X-Original-To headers.
if r.starts_with("to:") {
parts.push("(".to_string());
parts.push(r.to_string());
parts.push("OR".to_string());
parts.push(r.replace("to:", "ExtraTo:"));
parts.push(")".to_string());
} else {
parts.push(r.to_string());
}
}
parts.join(" ")
}
}
impl FromStr for Query {
type Err = Infallible;
fn from_str(s: &str) -> Result<Self, Self::Err> {
let mut unread_only = false;
let mut tags = Vec::new();
let mut uids = Vec::new();
let mut remainder = Vec::new();
let mut is_notmuch = false;
let mut is_newsreader = false;
let mut is_tantivy = false;
let mut corpus = None;
for word in s.split_whitespace() {
if word == "is:unread" {
unread_only = true
} else if word.starts_with("tag:") {
let t = &word["tag:".len()..];
// Per-address emails are faked as `tag:@<domain>/<username>`, rewrite to `to:` form
if t.starts_with('@') && t.contains('.') {
let t = match t.split_once('/') {
None => format!("to:{t}"),
Some((domain, user)) => format!("to:{user}{domain}"),
};
remainder.push(t);
} else {
tags.push(t.to_string());
};
/*
} else if word.starts_with("tag:") {
// Any tag that doesn't match site_prefix should explicitly set the site to something not in the
// database
site = Some(NON_EXISTENT_SITE_NAME.to_string());
*/
} else if word.starts_with("corpus:") {
let c = word["corpus:".len()..].to_string();
corpus = c.parse::<Corpus>().map(|c| Some(c)).unwrap_or_else(|e| {
warn!("Error parsing corpus '{c}': {e:?}");
None
});
} else if is_newsreader_thread(word) {
uids.push(word.to_string());
} else if is_notmuch_thread_or_id(word) {
uids.push(word.to_string());
} else if word == "is:mail" || word == "is:email" || word == "is:notmuch" {
is_notmuch = true;
} else if word == "is:news" {
is_newsreader = true;
} else if word == "is:newsreader" {
is_newsreader = true;
} else {
remainder.push(word.to_string());
}
}
// If we don't see any explicit filters for a corpus, flip them all on
if corpus.is_none() && !(is_notmuch || is_tantivy || is_newsreader) {
is_notmuch = true;
is_newsreader = true;
is_tantivy = true;
}
Ok(Query {
unread_only,
tags,
uids,
remainder,
is_notmuch,
is_newsreader,
is_tantivy,
corpus,
})
}
}
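// Worked example of the parser above: "is:unread tag:News/lwn is:news linux"
// yields unread_only = true, tags = ["News/lwn"], is_newsreader = true, and
// remainder = ["linux"]; is_notmuch and is_tantivy stay false because is:news
// explicitly selects a corpus.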
pub struct ThreadSummaryRecord {
pub site: Option<String>,
pub date: Option<PrimitiveDateTime>,
pub is_read: Option<bool>,
pub title: Option<String>,
pub uid: String,
pub name: Option<String>,
pub corpus: Corpus,
}
async fn thread_summary_from_row(r: ThreadSummaryRecord) -> ThreadSummary {
let site = r.site.unwrap_or("UNKOWN TAG".to_string());
let mut tags = vec![format!("{NEWSREADER_TAG_PREFIX}{site}")];
if !r.is_read.unwrap_or(true) {
tags.push("unread".to_string());
};
let mut title = r.title.unwrap_or("NO TITLE".to_string());
title = clean_title(&title).await.expect("failed to clean title");
ThreadSummary {
thread: format!("{NEWSREADER_THREAD_PREFIX}{}", r.uid),
timestamp: r
.date
.expect("post missing date")
.assume_utc()
.unix_timestamp() as isize,
date_relative: format!("{:?}", r.date),
//date_relative: "TODO date_relative".to_string(),
matched: 0,
total: 1,
authors: r.name.unwrap_or_else(|| site.clone()),
subject: title,
tags,
corpus: r.corpus,
}
}
async fn clean_title(title: &str) -> Result<String, ServerError> {
// Make title HTML so html parsers work
let mut title = format!("<html>{title}</html>");
let title_transformers: Vec<Box<dyn Transformer>> =
vec![Box::new(EscapeHtml), Box::new(StripHtml)];
for t in title_transformers.iter() {
if t.should_run(&None, &title) {
title = t.transform(&None, &title).await?;
}
}
Ok(title)
}
#[cfg(test)]
mod tests {
use super::{SanitizeHtml, Transformer};
#[tokio::test]
async fn strip_sizes() -> Result<(), Box<dyn std::error::Error>> {
let ss = SanitizeHtml {
cid_prefix: "",
base_url: &None,
};
let input = r#"<p width=16 height=16 style="color:blue;width:16px;height:16px;">This el has width and height attributes and inline styles</p>"#;
let want = r#"<p style="color:blue;">This el has width and height attributes and inline styles</p>"#;
let got = ss.transform(&None, input).await?;
assert_eq!(got, want);
Ok(())
}
}

View File

@@ -1,164 +0,0 @@
#[macro_use]
extern crate rocket;
mod error;
mod nm;
use std::{error::Error, io::Cursor, str::FromStr};
use glog::Flags;
use notmuch::{Notmuch, NotmuchError};
use rocket::{
http::{ContentType, Header},
request::Request,
response::{Debug, Responder},
serde::json::Json,
Response, State,
};
use rocket_cors::{AllowedHeaders, AllowedOrigins};
use crate::error::ServerError;
#[get("/")]
fn hello() -> &'static str {
"Hello, world!"
}
#[get("/refresh")]
async fn refresh(nm: &State<Notmuch>) -> Result<Json<String>, Debug<NotmuchError>> {
Ok(Json(String::from_utf8_lossy(&nm.new()?).to_string()))
}
#[get("/search")]
async fn search_all(
nm: &State<Notmuch>,
) -> Result<Json<shared::SearchResult>, Debug<NotmuchError>> {
search(nm, "*", None, None).await
}
#[get("/search/<query>?<page>&<results_per_page>")]
async fn search(
nm: &State<Notmuch>,
query: &str,
page: Option<usize>,
results_per_page: Option<usize>,
) -> Result<Json<shared::SearchResult>, Debug<NotmuchError>> {
let page = page.unwrap_or(0);
let results_per_page = results_per_page.unwrap_or(10);
info!(" search '{query}'");
let res = shared::SearchResult {
summary: nm.search(query, page * results_per_page, results_per_page)?,
query: query.to_string(),
page,
results_per_page,
total: nm.count(query)?,
};
Ok(Json(res))
}
#[get("/show/<query>")]
async fn show(
nm: &State<Notmuch>,
query: &str,
) -> Result<Json<Vec<shared::Message>>, Debug<ServerError>> {
let res = nm::threadset_to_messages(nm.show(query).map_err(|e| -> ServerError { e.into() })?)?;
Ok(Json(res))
}
struct PartResponder {
bytes: Vec<u8>,
filename: Option<String>,
}
impl<'r, 'o: 'r> Responder<'r, 'o> for PartResponder {
fn respond_to(self, _: &'r Request<'_>) -> rocket::response::Result<'o> {
let mut resp = Response::build();
if let Some(filename) = self.filename {
info!("filename {:?}", filename);
resp.header(Header::new(
"Content-Disposition",
format!(r#"attachment; filename="{}""#, filename),
))
.header(ContentType::Binary);
}
resp.sized_body(self.bytes.len(), Cursor::new(self.bytes))
.ok()
}
}
#[get("/original/<id>/part/<part>")]
async fn original_part(
nm: &State<Notmuch>,
id: &str,
part: usize,
) -> Result<PartResponder, Debug<NotmuchError>> {
let mid = if id.starts_with("id:") {
id.to_string()
} else {
format!("id:{}", id)
};
let meta = nm.show_part(&mid, part)?;
let res = nm.show_original_part(&mid, part)?;
Ok(PartResponder {
bytes: res,
filename: meta.filename,
})
}
#[get("/original/<id>")]
async fn original(
nm: &State<Notmuch>,
id: &str,
) -> Result<(ContentType, Vec<u8>), Debug<NotmuchError>> {
let mid = if id.starts_with("id:") {
id.to_string()
} else {
format!("id:{}", id)
};
let res = nm.show_original(&mid)?;
Ok((ContentType::Plain, res))
}
#[rocket::main]
async fn main() -> Result<(), Box<dyn Error>> {
glog::new()
.init(Flags {
colorlogtostderr: true,
//alsologtostderr: true, // use logtostderr to only write to stderr and not to files
logtostderr: true,
..Default::default()
})
.unwrap();
let allowed_origins = AllowedOrigins::all();
let cors = rocket_cors::CorsOptions {
allowed_origins,
allowed_methods: vec!["Get"]
.into_iter()
.map(|s| FromStr::from_str(s).unwrap())
.collect(),
allowed_headers: AllowedHeaders::some(&["Authorization", "Accept"]),
allow_credentials: true,
..Default::default()
}
.to_cors()?;
let _ = rocket::build()
.mount(
"/",
routes![
original_part,
original,
hello,
refresh,
search_all,
search,
show
],
)
.attach(cors)
.manage(Notmuch::default())
//.manage(Notmuch::with_config("../notmuch/testdata/notmuch.config"))
.launch()
.await?;
Ok(())
}

498
server/src/mvp.css Normal file
View File

@@ -0,0 +1,498 @@
/* MVP.css v1.15 - https://github.com/andybrewer/mvp */
/* :root content stored in client side index.html */
html {
scroll-behavior: smooth;
}
@media (prefers-reduced-motion: reduce) {
html {
scroll-behavior: auto;
}
}
/* Layout */
article aside {
background: var(--color-secondary-accent);
border-left: 4px solid var(--color-secondary);
padding: 0.01rem 0.8rem;
}
body {
background: var(--color-bg);
color: var(--color-text);
font-family: var(--font-family);
line-height: var(--line-height);
margin: 0;
overflow-x: hidden;
padding: 0;
}
footer,
header,
main {
margin: 0 auto;
max-width: var(--width-content);
padding: 3rem 1rem;
}
hr {
background-color: var(--color-bg-secondary);
border: none;
height: 1px;
margin: 4rem 0;
width: 100%;
}
section {
display: flex;
flex-wrap: wrap;
justify-content: var(--justify-important);
}
section img,
article img {
max-width: 100%;
}
section pre {
overflow: auto;
}
section aside {
border: 1px solid var(--color-bg-secondary);
border-radius: var(--border-radius);
box-shadow: var(--box-shadow) var(--color-shadow);
margin: 1rem;
padding: 1.25rem;
width: var(--width-card);
}
section aside:hover {
box-shadow: var(--box-shadow) var(--color-bg-secondary);
}
[hidden] {
display: none;
}
/* Headers */
article header,
div header,
main header {
padding-top: 0;
}
header {
text-align: var(--justify-important);
}
header a b,
header a em,
header a i,
header a strong {
margin-left: 0.5rem;
margin-right: 0.5rem;
}
header nav img {
margin: 1rem 0;
}
section header {
padding-top: 0;
width: 100%;
}
/* Nav */
nav {
align-items: center;
display: flex;
font-weight: bold;
justify-content: space-between;
margin-bottom: 7rem;
}
nav ul {
list-style: none;
padding: 0;
}
nav ul li {
display: inline-block;
margin: 0 0.5rem;
position: relative;
text-align: left;
}
/* Nav Dropdown */
nav ul li:hover ul {
display: block;
}
nav ul li ul {
background: var(--color-bg);
border: 1px solid var(--color-bg-secondary);
border-radius: var(--border-radius);
box-shadow: var(--box-shadow) var(--color-shadow);
display: none;
height: auto;
left: -2px;
padding: .5rem 1rem;
position: absolute;
top: 1.7rem;
white-space: nowrap;
width: auto;
z-index: 1;
}
nav ul li ul::before {
/* fill gap above to make mousing over them easier */
content: "";
position: absolute;
left: 0;
right: 0;
top: -0.5rem;
height: 0.5rem;
}
nav ul li ul li,
nav ul li ul li a {
display: block;
}
/* Typography */
code,
samp {
background-color: var(--color-accent);
border-radius: var(--border-radius);
color: var(--color-text);
display: inline-block;
margin: 0 0.1rem;
padding: 0 0.5rem;
}
details {
margin: 1.3rem 0;
}
details summary {
font-weight: bold;
cursor: pointer;
}
h1,
h2,
h3,
h4,
h5,
h6 {
line-height: var(--line-height);
text-wrap: balance;
}
mark {
padding: 0.1rem;
}
ol li,
ul li {
padding: 0.2rem 0;
}
p {
margin: 0.75rem 0;
padding: 0;
width: 100%;
}
pre {
margin: 1rem 0;
max-width: var(--width-card-wide);
padding: 1rem 0;
}
pre code,
pre samp {
display: block;
max-width: var(--width-card-wide);
padding: 0.5rem 2rem;
white-space: pre-wrap;
}
small {
color: var(--color-text-secondary);
}
sup {
background-color: var(--color-secondary);
border-radius: var(--border-radius);
color: var(--color-bg);
font-size: xx-small;
font-weight: bold;
margin: 0.2rem;
padding: 0.2rem 0.3rem;
position: relative;
top: -2px;
}
/* Links */
a {
color: var(--color-link);
display: inline-block;
font-weight: bold;
text-decoration: underline;
}
a:hover {
filter: brightness(var(--hover-brightness));
}
a:active {
filter: brightness(var(--active-brightness));
}
a b,
a em,
a i,
a strong,
button,
input[type="submit"] {
border-radius: var(--border-radius);
display: inline-block;
font-size: medium;
font-weight: bold;
line-height: var(--line-height);
margin: 0.5rem 0;
padding: 1rem 2rem;
}
button,
input[type="submit"] {
font-family: var(--font-family);
}
button:hover,
input[type="submit"]:hover {
cursor: pointer;
filter: brightness(var(--hover-brightness));
}
button:active,
input[type="submit"]:active {
filter: brightness(var(--active-brightness));
}
a b,
a strong,
button,
input[type="submit"] {
background-color: var(--color-link);
border: 2px solid var(--color-link);
color: var(--color-bg);
}
a em,
a i {
border: 2px solid var(--color-link);
border-radius: var(--border-radius);
color: var(--color-link);
display: inline-block;
padding: 1rem 2rem;
}
article aside a {
color: var(--color-secondary);
}
/* Images */
figure {
margin: 0;
padding: 0;
}
figure img {
max-width: 100%;
}
figure figcaption {
color: var(--color-text-secondary);
}
/* Forms */
button:disabled,
input:disabled {
background: var(--color-bg-secondary);
border-color: var(--color-bg-secondary);
color: var(--color-text-secondary);
cursor: not-allowed;
}
button[disabled]:hover,
input[type="submit"][disabled]:hover {
filter: none;
}
form {
border: 1px solid var(--color-bg-secondary);
border-radius: var(--border-radius);
box-shadow: var(--box-shadow) var(--color-shadow);
display: block;
max-width: var(--width-card-wide);
min-width: var(--width-card);
padding: 1.5rem;
text-align: var(--justify-normal);
}
form header {
margin: 1.5rem 0;
padding: 1.5rem 0;
}
input,
label,
select,
textarea {
display: block;
font-size: inherit;
max-width: var(--width-card-wide);
}
input[type="checkbox"],
input[type="radio"] {
display: inline-block;
}
input[type="checkbox"]+label,
input[type="radio"]+label {
display: inline-block;
font-weight: normal;
position: relative;
top: 1px;
}
input[type="range"] {
padding: 0.4rem 0;
}
input,
select,
textarea {
border: 1px solid var(--color-bg-secondary);
border-radius: var(--border-radius);
margin-bottom: 1rem;
padding: 0.4rem 0.8rem;
}
input[type="text"],
input[type="password"] textarea {
width: calc(100% - 1.6rem);
}
input[readonly],
textarea[readonly] {
background-color: var(--color-bg-secondary);
}
label {
font-weight: bold;
margin-bottom: 0.2rem;
}
/* Popups */
dialog {
border: 1px solid var(--color-bg-secondary);
border-radius: var(--border-radius);
box-shadow: var(--box-shadow) var(--color-shadow);
position: fixed;
top: 50%;
left: 50%;
transform: translate(-50%, -50%);
width: 50%;
z-index: 999;
}
/* Tables */
table {
border: 1px solid var(--color-bg-secondary);
border-radius: var(--border-radius);
border-spacing: 0;
display: inline-block;
max-width: 100%;
overflow-x: auto;
padding: 0;
white-space: nowrap;
}
table td,
table th,
table tr {
padding: 0.4rem 0.8rem;
text-align: var(--justify-important);
}
table thead {
background-color: var(--color-table);
border-collapse: collapse;
border-radius: var(--border-radius);
color: var(--color-bg);
margin: 0;
padding: 0;
}
table thead tr:first-child th:first-child {
border-top-left-radius: var(--border-radius);
}
table thead tr:first-child th:last-child {
border-top-right-radius: var(--border-radius);
}
table thead th:first-child,
table tr td:first-child {
text-align: var(--justify-normal);
}
table tr:nth-child(even) {
background-color: var(--color-accent);
}
/* Quotes */
blockquote {
display: block;
font-size: x-large;
line-height: var(--line-height);
margin: 1rem auto;
max-width: var(--width-card-medium);
padding: 1.5rem 1rem;
text-align: var(--justify-important);
}
blockquote footer {
color: var(--color-text-secondary);
display: block;
font-size: small;
line-height: var(--line-height);
padding: 1.5rem 0;
}
/* Scrollbars */
* {
scrollbar-width: thin;
scrollbar-color: var(--color-scrollbar) transparent;
}
*::-webkit-scrollbar {
width: 5px;
height: 5px;
}
*::-webkit-scrollbar-track {
background: transparent;
}
*::-webkit-scrollbar-thumb {
background-color: var(--color-scrollbar);
border-radius: 10px;
}

384
server/src/newsreader.rs Normal file

@@ -0,0 +1,384 @@
use std::collections::HashMap;
use cacher::FilesystemCacher;
use futures::{stream::FuturesUnordered, StreamExt};
use letterbox_shared::compute_color;
use log::{error, info};
use maplit::hashmap;
use scraper::Selector;
use sqlx::postgres::PgPool;
use tracing::instrument;
use url::Url;
use crate::{
clean_title, compute_offset_limit,
error::ServerError,
graphql::{Corpus, NewsPost, Tag, Thread, ThreadSummary},
thread_summary_from_row, AddOutlink, FrameImages, Query, SanitizeHtml, SlurpContents,
StripHtml, ThreadSummaryRecord, Transformer, NEWSREADER_TAG_PREFIX, NEWSREADER_THREAD_PREFIX,
};
pub fn is_newsreader_query(query: &Query) -> bool {
query.is_newsreader || query.corpus == Some(Corpus::Newsreader)
}
pub fn is_newsreader_thread(query: &str) -> bool {
query.starts_with(NEWSREADER_THREAD_PREFIX)
}
pub fn extract_thread_id(query: &str) -> &str {
if query.starts_with(NEWSREADER_THREAD_PREFIX) {
&query[NEWSREADER_THREAD_PREFIX.len()..]
} else {
query
}
}
pub fn extract_site(tag: &str) -> &str {
&tag[NEWSREADER_TAG_PREFIX.len()..]
}
pub fn make_news_tag(tag: &str) -> String {
format!("tag:{NEWSREADER_TAG_PREFIX}{tag}")
}
fn site_from_tags(tags: &[String]) -> Option<String> {
for t in tags {
if t.starts_with(NEWSREADER_TAG_PREFIX) {
return Some(extract_site(t).to_string());
}
}
None
}
#[instrument(name = "newsreader::count", skip_all, fields(query=%query))]
pub async fn count(pool: &PgPool, query: &Query) -> Result<usize, ServerError> {
if !is_newsreader_query(query) {
return Ok(0);
}
let site = site_from_tags(&query.tags);
if !query.tags.is_empty() && site.is_none() {
// Newsreader can only handle all-site read/unread queries; anything with a
// non-site tag isn't supported.
return Ok(0);
}
let search_term = query.remainder.join(" ");
let search_term = search_term.trim();
let search_term = if search_term.is_empty() {
None
} else {
Some(search_term)
};
// TODO: add support for looking for search_term in title and site
let row = sqlx::query_file!("sql/count.sql", site, query.unread_only, search_term)
.fetch_one(pool)
.await?;
Ok(row.count.unwrap_or(0).try_into().unwrap_or(0))
}
#[instrument(name = "newsreader::search", skip_all, fields(query=%query))]
pub async fn search(
pool: &PgPool,
after: Option<i32>,
before: Option<i32>,
first: Option<i32>,
last: Option<i32>,
query: &Query,
) -> Result<Vec<(i32, ThreadSummary)>, async_graphql::Error> {
info!("search({after:?} {before:?} {first:?} {last:?} {query:?}");
if !is_newsreader_query(query) {
return Ok(Vec::new());
}
let site = site_from_tags(&query.tags);
if !query.tags.is_empty() && site.is_none() {
// Newsreader can only handle all-site read/unread queries; anything with a
// non-site tag isn't supported.
return Ok(Vec::new());
}
let (offset, mut limit) = compute_offset_limit(after, before, first, last);
if before.is_none() {
// When searching forward, the +1 is to see if there are more pages of data available.
// Searching backwards implies there's more pages forward, because the value represented by
// `before` is on the next page.
limit = limit + 1;
}
info!(
"search offset {offset} limit {limit} site {site:?} unread_only {}",
query.unread_only
);
let search_term = query.remainder.join(" ");
let search_term = search_term.trim();
let search_term = if search_term.is_empty() {
None
} else {
Some(search_term)
};
// TODO: add support for looking for search_term in title and site
let rows = sqlx::query_file!(
"sql/threads.sql",
site,
query.unread_only,
offset as i64,
limit as i64,
search_term
)
.fetch_all(pool)
.await?;
let mut res = Vec::new();
for (i, r) in rows.into_iter().enumerate() {
res.push((
i as i32 + offset,
thread_summary_from_row(ThreadSummaryRecord {
site: r.site,
date: r.date,
is_read: r.is_read,
title: r.title,
uid: r.uid,
name: r.name,
corpus: Corpus::Newsreader,
})
.await,
));
}
Ok(res)
}
#[instrument(name = "newsreader::tags", skip_all, fields(needs_unread=%_needs_unread))]
pub async fn tags(pool: &PgPool, _needs_unread: bool) -> Result<Vec<Tag>, ServerError> {
// TODO: optimize query by using needs_unread
let tags = sqlx::query_file!("sql/tags.sql").fetch_all(pool).await?;
let tags = tags
.into_iter()
.map(|tag| {
let unread = tag.unread.unwrap_or(0).try_into().unwrap_or(0);
let name = format!(
"{NEWSREADER_TAG_PREFIX}{}",
tag.site.expect("tag must have site")
);
let hex = compute_color(&name);
Tag {
name,
fg_color: "white".to_string(),
bg_color: hex,
unread,
}
})
.collect();
Ok(tags)
}
#[instrument(name = "newsreader::thread", skip_all, fields(thread_id=%thread_id))]
pub async fn thread(
cacher: &FilesystemCacher,
pool: &PgPool,
thread_id: String,
) -> Result<Thread, ServerError> {
let id = thread_id
.strip_prefix(NEWSREADER_THREAD_PREFIX)
.expect("news thread doesn't start with '{NEWSREADER_THREAD_PREFIX}'")
.to_string();
let r = sqlx::query_file!("sql/thread.sql", id)
.fetch_one(pool)
.await?;
let slug = r.site.unwrap_or("no-slug".to_string());
let site = r.name.unwrap_or("NO SITE".to_string());
// TODO: remove the various places that have this as an Option
let link = Some(Url::parse(&r.link)?);
let mut body = r.clean_summary.unwrap_or("NO SUMMARY".to_string());
let body_transformers: Vec<Box<dyn Transformer>> = vec![
Box::new(SlurpContents {
cacher,
inline_css: true,
site_selectors: slurp_contents_selectors(),
}),
Box::new(FrameImages),
Box::new(AddOutlink),
// TODO: causes doubling of images in cloudflare blogs
//Box::new(EscapeHtml),
Box::new(SanitizeHtml {
cid_prefix: "",
base_url: &link,
}),
];
for t in body_transformers.iter() {
if t.should_run(&link, &body) {
body = t.transform(&link, &body).await?;
}
}
let title = clean_title(&r.title.unwrap_or("NO TITLE".to_string())).await?;
let is_read = r.is_read.unwrap_or(false);
let timestamp = r
.date
.expect("post missing date")
.assume_utc()
.unix_timestamp();
Ok(Thread::News(NewsPost {
thread_id,
is_read,
slug,
site,
title,
body,
url: link
.as_ref()
.map(|url| url.to_string())
.unwrap_or("NO URL".to_string()),
timestamp,
}))
}
#[instrument(name = "newsreader::set_read_status", skip_all, fields(query=%query,unread=%unread))]
pub async fn set_read_status<'ctx>(
pool: &PgPool,
query: &Query,
unread: bool,
) -> Result<bool, ServerError> {
// TODO: make single query when query.uids.len() > 1
let uids: Vec<_> = query
.uids
.iter()
.filter(|uid| is_newsreader_thread(uid))
.map(
|uid| extract_thread_id(uid), // TODO strip prefix
)
.collect();
for uid in uids {
sqlx::query_file!("sql/set_unread.sql", !unread, uid)
.execute(pool)
.await?;
}
Ok(true)
}
#[instrument(name = "newsreader::refresh", skip_all)]
pub async fn refresh<'ctx>(pool: &PgPool, cacher: &FilesystemCacher) -> Result<bool, ServerError> {
async fn update_search_summary(
pool: &PgPool,
cacher: &FilesystemCacher,
link: String,
body: String,
id: i32,
) -> Result<(), ServerError> {
let slurp_contents = SlurpContents {
cacher,
inline_css: true,
site_selectors: slurp_contents_selectors(),
};
let strip_html = StripHtml;
info!("adding {link} to search index");
let mut body = body;
if let Ok(link) = Url::parse(&link) {
let link = Some(link);
if slurp_contents.should_run(&link, &body) {
body = slurp_contents.transform(&link, &body).await?;
}
} else {
error!("failed to parse link: {}", link);
}
body = strip_html.transform(&None, &body).await?;
sqlx::query!(
"UPDATE post SET search_summary = $1 WHERE id = $2",
body,
id
)
.execute(pool)
.await?;
Ok(())
}
let mut unordered: FuturesUnordered<_> = sqlx::query_file!("sql/need-search-summary.sql",)
.fetch_all(pool)
.await?
.into_iter()
.filter_map(|r| {
let Some(body) = r.clean_summary else {
error!("clean_summary missing for {}", r.link);
return None;
};
let id = r.id;
Some(update_search_summary(pool, cacher, r.link, body, id))
})
.collect();
while let Some(res) = unordered.next().await {
match res {
Ok(()) => {}
Err(err) => {
info!("failed refresh {err:?}");
// TODO:
//fd.error = Some(err);
}
};
}
Ok(true)
}
fn slurp_contents_selectors() -> HashMap<String, Vec<Selector>> {
hashmap![
"atmeta.com".to_string() => vec![
Selector::parse("div.entry-content").unwrap(),
],
"blog.prusa3d.com".to_string() => vec![
Selector::parse("article.content .post-block").unwrap(),
],
"blog.cloudflare.com".to_string() => vec![
Selector::parse(".author-lists .author-name-tooltip").unwrap(),
Selector::parse(".post-full-content").unwrap()
],
"blog.zsa.io".to_string() => vec![
Selector::parse("section.blog-article").unwrap(),
],
"engineering.fb.com".to_string() => vec![
Selector::parse("article").unwrap(),
],
"grafana.com".to_string() => vec![
Selector::parse(".blog-content").unwrap(),
],
"hackaday.com".to_string() => vec![
Selector::parse("div.entry-featured-image").unwrap(),
Selector::parse("div.entry-content").unwrap()
],
"ingowald.blog".to_string() => vec![
Selector::parse("article").unwrap(),
],
"jvns.ca".to_string() => vec![
Selector::parse("article").unwrap(),
],
"mitchellh.com".to_string() => vec![Selector::parse("div.w-full").unwrap()],
"natwelch.com".to_string() => vec![
Selector::parse("article div.prose").unwrap(),
],
"rustacean-station.org".to_string() => vec![
Selector::parse("article").unwrap(),
],
"slashdot.org".to_string() => vec![
Selector::parse("span.story-byline").unwrap(),
Selector::parse("div.p").unwrap(),
],
"theonion.com".to_string() => vec![
// Single image joke w/ title
Selector::parse("article > section > div > figure").unwrap(),
// Single cartoon
Selector::parse("article > div > div > figure").unwrap(),
// Image at top of article
Selector::parse("article > header > div > div > figure").unwrap(),
// Article body
Selector::parse("article .entry-content > *").unwrap(),
],
"trofi.github.io".to_string() => vec![
Selector::parse("#content").unwrap(),
],
"www.redox-os.org".to_string() => vec![
Selector::parse("div.content").unwrap(),
],
"www.smbc-comics.com".to_string() => vec![
Selector::parse("img#cc-comic").unwrap(),
Selector::parse("div#aftercomic img").unwrap(),
],
]
}
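
The selector table above drives SlurpContents when a feed ships only a stub summary. As a rough sketch of how per-site selectors like these get applied with the scraper crate (illustrative only; the real transformer also inlines CSS and caches fetches via FilesystemCacher):

use scraper::{Html, Selector};

// Sketch: collect the HTML of every element matched by any of a site's
// selectors, in selector order, and join the fragments.
fn extract_fragments(page_html: &str, selectors: &[Selector]) -> Option<String> {
    let doc = Html::parse_document(page_html);
    let fragments: Vec<String> = selectors
        .iter()
        .flat_map(|sel| doc.select(sel).map(|el| el.html()))
        .collect();
    if fragments.is_empty() {
        None
    } else {
        Some(fragments.join("\n"))
    }
}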

927
server/src/nm.rs Normal file

@@ -0,0 +1,927 @@
use std::{collections::HashMap, fs::File};
use letterbox_notmuch::Notmuch;
use letterbox_shared::compute_color;
use log::{error, info, warn};
use mailparse::{parse_content_type, parse_mail, MailHeader, MailHeaderMap, ParsedMail};
use memmap::MmapOptions;
use sqlx::PgPool;
use tracing::instrument;
use crate::{
compute_offset_limit,
error::ServerError,
graphql::{
Attachment, Body, Corpus, DispositionType, Email, EmailThread, Header, Html, Message,
PlainText, Tag, Thread, ThreadSummary, UnhandledContentType,
},
linkify_html, InlineStyle, Query, SanitizeHtml, Transformer,
};
const IMAGE_JPEG: &str = "image/jpeg";
const IMAGE_PJPEG: &str = "image/pjpeg";
const IMAGE_PNG: &str = "image/png";
const MESSAGE_RFC822: &str = "message/rfc822";
const MULTIPART_ALTERNATIVE: &str = "multipart/alternative";
const MULTIPART_MIXED: &str = "multipart/mixed";
const MULTIPART_RELATED: &str = "multipart/related";
const TEXT_HTML: &str = "text/html";
const TEXT_PLAIN: &str = "text/plain";
const MAX_RAW_MESSAGE_SIZE: usize = 100_000;
fn is_notmuch_query(query: &Query) -> bool {
query.is_notmuch || query.corpus == Some(Corpus::Notmuch)
}
pub fn is_notmuch_thread_or_id(id: &str) -> bool {
id.starts_with("id:") || id.starts_with("thread:")
}
// TODO(wathiede): decide good error type
pub fn threadset_to_messages(
thread_set: letterbox_notmuch::ThreadSet,
) -> Result<Vec<Message>, ServerError> {
for t in thread_set.0 {
for _tn in t.0 {}
}
Ok(Vec::new())
}
#[instrument(name="nm::count", skip_all, fields(query=%query))]
pub async fn count(nm: &Notmuch, query: &Query) -> Result<usize, ServerError> {
if !is_notmuch_query(query) {
return Ok(0);
}
let query = query.to_notmuch();
Ok(nm.count(&query)?)
}
#[instrument(name="nm::search", skip_all, fields(query=%query))]
pub async fn search(
nm: &Notmuch,
after: Option<i32>,
before: Option<i32>,
first: Option<i32>,
last: Option<i32>,
query: &Query,
) -> Result<Vec<(i32, ThreadSummary)>, async_graphql::Error> {
if !is_notmuch_query(query) {
return Ok(Vec::new());
}
let query = query.to_notmuch();
let (offset, mut limit) = compute_offset_limit(after, before, first, last);
if before.is_none() {
// When searching forward, the +1 is to see if there are more pages of data available.
// Searching backwards implies there's more pages forward, because the value represented by
// `before` is on the next page.
limit = limit + 1;
}
Ok(nm
.search(&query, offset as usize, limit as usize)?
.0
.into_iter()
.enumerate()
.map(|(i, ts)| {
(
offset + i as i32,
ThreadSummary {
thread: format!("thread:{}", ts.thread),
timestamp: ts.timestamp,
date_relative: ts.date_relative,
matched: ts.matched,
total: ts.total,
authors: ts.authors,
subject: ts.subject,
tags: ts.tags,
corpus: Corpus::Notmuch,
},
)
})
.collect())
}
#[instrument(name="nm::tags", skip_all, fields(needs_unread=needs_unread))]
pub fn tags(nm: &Notmuch, needs_unread: bool) -> Result<Vec<Tag>, ServerError> {
let unread_msg_cnt: HashMap<String, usize> = if needs_unread {
// 10000 is an arbitrary limit; if there are more than 10k unread messages,
// the count will be inaccurate.
nm.search("is:unread", 0, 10000)?
.0
.iter()
.fold(HashMap::new(), |mut m, ts| {
ts.tags.iter().for_each(|t| {
m.entry(t.clone()).and_modify(|c| *c += 1).or_insert(1);
});
m
})
} else {
HashMap::new()
};
let tags: Vec<_> = nm
.tags()?
.into_iter()
.map(|tag| {
let hex = compute_color(&tag);
let unread = if needs_unread {
*unread_msg_cnt.get(&tag).unwrap_or(&0)
} else {
0
};
Tag {
name: tag,
fg_color: "white".to_string(),
bg_color: hex,
unread,
}
})
.chain(
nm.unread_recipients()?
.into_iter()
.filter_map(|(name, unread)| {
let Some(idx) = name.find('@') else {
return None;
};
let name = format!("{}/{}", &name[idx..], &name[..idx]);
let bg_color = compute_color(&name);
Some(Tag {
name,
fg_color: "white".to_string(),
bg_color,
unread,
})
}),
)
.collect();
Ok(tags)
}
#[instrument(name="nm::thread", skip_all, fields(thread_id=thread_id))]
pub async fn thread(
nm: &Notmuch,
pool: &PgPool,
thread_id: String,
debug_content_tree: bool,
) -> Result<Thread, ServerError> {
// TODO(wathiede): normalize all email addresses through an address book with preferred
// display names (that default to the most commonly seen name).
let mut messages = Vec::new();
for (path, id) in std::iter::zip(nm.files(&thread_id)?, nm.message_ids(&thread_id)?) {
let tags = nm.tags_for_query(&format!("id:{id}"))?;
let file = File::open(&path)?;
let mmap = unsafe { MmapOptions::new().map(&file)? };
let m = parse_mail(&mmap)?;
let from = email_addresses(&path, &m, "from")?;
let mut from = match from.len() {
0 => None,
1 => from.into_iter().next(),
_ => {
warn!(
"Got {} from addresses in message, truncating: {:?}",
from.len(),
from
);
from.into_iter().next()
}
};
match from.as_mut() {
Some(from) => {
if let Some(addr) = from.addr.as_mut() {
let photo_url = photo_url_for_email_address(&pool, &addr).await?;
from.photo_url = photo_url;
}
}
_ => (),
}
let to = email_addresses(&path, &m, "to")?;
let cc = email_addresses(&path, &m, "cc")?;
let delivered_to = email_addresses(&path, &m, "delivered-to")?.pop();
let x_original_to = email_addresses(&path, &m, "x-original-to")?.pop();
let subject = m.headers.get_first_value("subject");
let timestamp = m
.headers
.get_first_value("date")
.and_then(|d| mailparse::dateparse(&d).ok());
let cid_prefix = letterbox_shared::urls::cid_prefix(None, &id);
let base_url = None;
let mut part_addr = Vec::new();
part_addr.push(id.to_string());
let body = match extract_body(&m, &mut part_addr)? {
Body::PlainText(PlainText { text, content_tree }) => {
let text = if text.len() > MAX_RAW_MESSAGE_SIZE {
format!(
"{}...\n\nMESSAGE WAS TRUNCATED @ {} bytes",
&text[..MAX_RAW_MESSAGE_SIZE],
MAX_RAW_MESSAGE_SIZE
)
} else {
text
};
Body::Html(Html {
html: {
let body_transformers: Vec<Box<dyn Transformer>> = vec![
Box::new(InlineStyle),
Box::new(SanitizeHtml {
cid_prefix: &cid_prefix,
base_url: &base_url,
}),
];
let mut html = linkify_html(&text.trim_matches('\n'));
for t in body_transformers.iter() {
if t.should_run(&None, &html) {
html = t.transform(&None, &html).await?;
}
}
format!(
r#"<p class="view-part-text-plain font-mono whitespace-pre-line">{}</p>"#,
// Trim newlines to prevent excessive white space at the beginning/end of
// presentation. Leave tabs and spaces in case plain text attempts to center a
// header on the first line.
html
)
},
content_tree: if debug_content_tree {
render_content_type_tree(&m)
} else {
content_tree
},
})
}
Body::Html(Html {
mut html,
content_tree,
}) => Body::Html(Html {
html: {
let body_transformers: Vec<Box<dyn Transformer>> = vec![
// TODO: this breaks things like emails from calendar
//Box::new(InlineStyle),
Box::new(SanitizeHtml {
cid_prefix: &cid_prefix,
base_url: &base_url,
}),
];
for t in body_transformers.iter() {
if t.should_run(&None, &html) {
html = t.transform(&None, &html).await?;
}
}
html
},
content_tree: if debug_content_tree {
render_content_type_tree(&m)
} else {
content_tree
},
}),
Body::UnhandledContentType(UnhandledContentType { content_tree, .. }) => {
let body_start = mmap
.windows(2)
.take(20_000)
.position(|w| w == b"\n\n")
.unwrap_or(0);
let body = mmap[body_start + 2..].to_vec();
Body::UnhandledContentType(UnhandledContentType {
text: String::from_utf8(body)?,
content_tree: if debug_content_tree {
render_content_type_tree(&m)
} else {
content_tree
},
})
}
};
let headers = m
.headers
.iter()
.map(|h| Header {
key: h.get_key(),
value: h.get_value(),
})
.collect();
// TODO(wathiede): parse message and fill out attachments
let attachments = extract_attachments(&m, &id)?;
messages.push(Message {
id: format!("id:{id}"),
from,
to,
cc,
subject,
tags,
timestamp,
headers,
body,
path,
attachments,
delivered_to,
x_original_to,
});
}
messages.reverse();
// Find the first subject that's set. After reversing the vec, this should be the oldest
// message.
let subject: String = messages
.iter()
.skip_while(|m| m.subject.is_none())
.next()
.and_then(|m| m.subject.clone())
.unwrap_or("(NO SUBJECT)".to_string());
Ok(Thread::Email(EmailThread {
thread_id,
subject,
messages,
}))
}
fn email_addresses(
_path: &str,
m: &ParsedMail,
header_name: &str,
) -> Result<Vec<Email>, ServerError> {
let mut addrs = Vec::new();
for header_value in m.headers.get_all_values(header_name) {
match mailparse::addrparse(&header_value) {
Ok(mal) => {
for ma in mal.into_inner() {
match ma {
mailparse::MailAddr::Group(gi) => {
if !gi.group_name.contains("ndisclosed") {}
}
mailparse::MailAddr::Single(s) => addrs.push(Email {
name: s.display_name,
addr: Some(s.addr),
photo_url: None,
}), //println!("Single: {s}"),
}
}
}
Err(_) => {
let v = header_value;
if v.matches('@').count() == 1 {
if v.matches('<').count() == 1 && v.ends_with('>') {
let idx = v.find('<').unwrap();
let addr = &v[idx + 1..v.len() - 1].trim();
let name = &v[..idx].trim();
addrs.push(Email {
name: Some(name.to_string()),
addr: Some(addr.to_string()),
photo_url: None,
});
}
} else {
addrs.push(Email {
name: Some(v),
addr: None,
photo_url: None,
});
}
}
}
}
Ok(addrs)
}
pub fn cid_attachment_bytes(nm: &Notmuch, id: &str, cid: &str) -> Result<Attachment, ServerError> {
let files = nm.files(id)?;
let Some(path) = files.first() else {
warn!("failed to find files for message {id}");
return Err(ServerError::PartNotFound);
};
let file = File::open(&path)?;
let mmap = unsafe { MmapOptions::new().map(&file)? };
let m = parse_mail(&mmap)?;
if let Some(attachment) = walk_attachments(&m, |sp, _cur_idx| {
info!("{cid} {:?}", get_content_id(&sp.headers));
if let Some(h_cid) = get_content_id(&sp.headers) {
let h_cid = &h_cid[1..h_cid.len() - 1];
if h_cid == cid {
let attachment = extract_attachment(&sp, id, &[]).unwrap_or(Attachment {
..Attachment::default()
});
return Some(attachment);
}
}
None
}) {
return Ok(attachment);
}
Err(ServerError::PartNotFound)
}
pub fn attachment_bytes(nm: &Notmuch, id: &str, idx: &[usize]) -> Result<Attachment, ServerError> {
let files = nm.files(id)?;
let Some(path) = files.first() else {
warn!("failed to find files for message {id}");
return Err(ServerError::PartNotFound);
};
let file = File::open(&path)?;
let mmap = unsafe { MmapOptions::new().map(&file)? };
let m = parse_mail(&mmap)?;
if let Some(attachment) = walk_attachments(&m, |sp, cur_idx| {
if cur_idx == idx {
let attachment = extract_attachment(&sp, id, idx).unwrap_or(Attachment {
..Attachment::default()
});
return Some(attachment);
}
None
}) {
return Ok(attachment);
}
Err(ServerError::PartNotFound)
}
fn extract_body(m: &ParsedMail, part_addr: &mut Vec<String>) -> Result<Body, ServerError> {
let body = m.get_body()?;
let ret = match m.ctype.mimetype.as_str() {
TEXT_PLAIN => return Ok(Body::text(body)),
TEXT_HTML => return Ok(Body::html(body)),
MULTIPART_MIXED => extract_mixed(m, part_addr),
MULTIPART_ALTERNATIVE => extract_alternative(m, part_addr),
MULTIPART_RELATED => extract_related(m, part_addr),
_ => extract_unhandled(m),
};
if let Err(err) = ret {
error!("Failed to extract body: {err:?}");
return Ok(extract_unhandled(m)?);
}
ret
}
fn extract_unhandled(m: &ParsedMail) -> Result<Body, ServerError> {
let msg = format!(
"Unhandled body content type:\n{}\n{}",
render_content_type_tree(m),
m.get_body()?,
);
Ok(Body::UnhandledContentType(UnhandledContentType {
text: msg,
content_tree: render_content_type_tree(m),
}))
}
// multipart/alternative defines multiple representations of the same message, and clients should
// show the fanciest they can display. For this program, the priority is text/html, text/plain,
// then give up.
fn extract_alternative(m: &ParsedMail, part_addr: &mut Vec<String>) -> Result<Body, ServerError> {
let handled_types = vec![
MULTIPART_ALTERNATIVE,
MULTIPART_MIXED,
MULTIPART_RELATED,
TEXT_HTML,
TEXT_PLAIN,
];
for sp in &m.subparts {
if sp.ctype.mimetype.as_str() == MULTIPART_ALTERNATIVE {
return extract_alternative(sp, part_addr);
}
}
for sp in &m.subparts {
if sp.ctype.mimetype.as_str() == MULTIPART_MIXED {
return extract_mixed(sp, part_addr);
}
}
for sp in &m.subparts {
if sp.ctype.mimetype.as_str() == MULTIPART_RELATED {
return extract_related(sp, part_addr);
}
}
for sp in &m.subparts {
if sp.ctype.mimetype.as_str() == TEXT_HTML {
let body = sp.get_body()?;
return Ok(Body::html(body));
}
}
for sp in &m.subparts {
if sp.ctype.mimetype.as_str() == TEXT_PLAIN {
let body = sp.get_body()?;
return Ok(Body::text(body));
}
}
Err(ServerError::StringError(format!(
"extract_alternative failed to find suitable subpart, searched: {:?}",
handled_types
)))
}
// multipart/mixed defines multiple types of content, all of which should be presented to the
// user 'serially'.
fn extract_mixed(m: &ParsedMail, part_addr: &mut Vec<String>) -> Result<Body, ServerError> {
//todo!("add some sort of visual indicator there are unhandled types, e.g. .ics files");
let handled_types = vec![
IMAGE_JPEG,
IMAGE_PJPEG,
IMAGE_PNG,
MESSAGE_RFC822,
MULTIPART_ALTERNATIVE,
MULTIPART_RELATED,
TEXT_HTML,
TEXT_PLAIN,
];
let mut unhandled_types: Vec<_> = m
.subparts
.iter()
.map(|sp| sp.ctype.mimetype.as_str())
.filter(|mt| !handled_types.contains(&mt))
.collect();
unhandled_types.sort();
if !unhandled_types.is_empty() {
warn!("{MULTIPART_MIXED} contains the following unhandled mimetypes {unhandled_types:?}");
}
let mut parts = Vec::new();
for (idx, sp) in m.subparts.iter().enumerate() {
part_addr.push(idx.to_string());
match sp.ctype.mimetype.as_str() {
MESSAGE_RFC822 => parts.push(extract_rfc822(&sp, part_addr)?),
MULTIPART_RELATED => parts.push(extract_related(sp, part_addr)?),
MULTIPART_ALTERNATIVE => parts.push(extract_alternative(sp, part_addr)?),
TEXT_PLAIN => parts.push(Body::text(sp.get_body()?)),
TEXT_HTML => parts.push(Body::html(sp.get_body()?)),
IMAGE_PJPEG | IMAGE_JPEG | IMAGE_PNG => {
let pcd = sp.get_content_disposition();
let filename = pcd
.params
.get("filename")
.map(|s| s.clone())
.unwrap_or("".to_string());
// Only add inline images; attachments are handled as an attribute of the top-level Message and rendered separately client-side.
if pcd.disposition == mailparse::DispositionType::Inline {
// TODO: make URL generation more programmatic based on what the frontend has
// mapped
parts.push(Body::html(format!(
r#"<img src="/api/view/attachment/{}/{}/{filename}">"#,
part_addr[0],
part_addr
.iter()
.skip(1)
.map(|i| i.to_string())
.collect::<Vec<_>>()
.join(".")
)));
}
}
mt => parts.push(unhandled_html(MULTIPART_MIXED, mt)),
}
part_addr.pop();
}
Ok(flatten_body_parts(&parts))
}
fn unhandled_html(parent_type: &str, child_type: &str) -> Body {
Body::Html(Html {
html: format!(
r#"
<div class="p-4 error">
Unhandled mimetype {child_type} in a {parent_type} message
</div>
"#
),
content_tree: String::new(),
})
}
fn flatten_body_parts(parts: &[Body]) -> Body {
let html = parts
.iter()
.map(|p| match p {
Body::PlainText(PlainText { text, .. }) => {
format!(
r#"<p class="view-part-text-plain font-mono whitespace-pre-line">{}</p>"#,
// Trim newlines to prevent excessive white space at the beginning/end of
// presentation. Leave tabs and spaces in case plain text attempts to center a
// header on the first line.
linkify_html(&html_escape::encode_text(text).trim_matches('\n'))
)
}
Body::Html(Html { html, .. }) => html.clone(),
Body::UnhandledContentType(UnhandledContentType { text, .. }) => {
error!("text len {}", text.len());
format!(
r#"<p class="view-part-unhandled">{}</p>"#,
// Trim newlines to prevent excessive white space at the beginning/end of
// presentation. Leave tabs and spaces in case plain text attempts to center a
// header on the first line.
linkify_html(&html_escape::encode_text(text).trim_matches('\n'))
)
}
})
.collect::<Vec<_>>()
.join("\n");
info!("flatten_body_parts {}", parts.len());
Body::html(html)
}
fn extract_related(m: &ParsedMail, part_addr: &mut Vec<String>) -> Result<Body, ServerError> {
// TODO(wathiede): collect related things and change return type to new Body arm.
let handled_types = vec![
MULTIPART_ALTERNATIVE,
TEXT_HTML,
TEXT_PLAIN,
IMAGE_JPEG,
IMAGE_PJPEG,
IMAGE_PNG,
];
let mut unhandled_types: Vec<_> = m
.subparts
.iter()
.map(|sp| sp.ctype.mimetype.as_str())
.filter(|mt| !handled_types.contains(&mt))
.collect();
unhandled_types.sort();
if !unhandled_types.is_empty() {
warn!("{MULTIPART_RELATED} contains the following unhandled mimetypes {unhandled_types:?}");
}
for (i, sp) in m.subparts.iter().enumerate() {
if sp.ctype.mimetype == IMAGE_PNG
|| sp.ctype.mimetype == IMAGE_JPEG
|| sp.ctype.mimetype == IMAGE_PJPEG
{
info!("sp.ctype {:#?}", sp.ctype);
//info!("sp.headers {:#?}", sp.headers);
if let Some(cid) = sp.headers.get_first_value("Content-Id") {
let mut part_id = part_addr.clone();
part_id.push(i.to_string());
info!("cid: {cid} part_id {part_id:?}");
}
}
}
for sp in &m.subparts {
if sp.ctype.mimetype == MULTIPART_ALTERNATIVE {
return extract_alternative(m, part_addr);
}
}
for sp in &m.subparts {
if sp.ctype.mimetype == TEXT_HTML {
let body = sp.get_body()?;
return Ok(Body::html(body));
}
}
for sp in &m.subparts {
if sp.ctype.mimetype == TEXT_PLAIN {
let body = sp.get_body()?;
return Ok(Body::text(body));
}
}
Err(ServerError::StringError(format!(
"extract_related failed to find suitable subpart, searched: {:?}",
handled_types
)))
}
fn walk_attachments<T, F: Fn(&ParsedMail, &[usize]) -> Option<T> + Copy>(
m: &ParsedMail,
visitor: F,
) -> Option<T> {
let mut cur_addr = Vec::new();
walk_attachments_inner(m, visitor, &mut cur_addr)
}
fn walk_attachments_inner<T, F: Fn(&ParsedMail, &[usize]) -> Option<T> + Copy>(
m: &ParsedMail,
visitor: F,
cur_addr: &mut Vec<usize>,
) -> Option<T> {
for (idx, sp) in m.subparts.iter().enumerate() {
cur_addr.push(idx);
let val = visitor(sp, &cur_addr);
if val.is_some() {
return val;
}
let val = walk_attachments_inner(sp, visitor, cur_addr);
if val.is_some() {
return val;
}
cur_addr.pop();
}
None
}
// TODO(wathiede): make this use walk_attachments with a closure.
// Then implement one closure for building `Attachment` and implement another that can be used to
// get the bytes for serving attachments over HTTP.
fn extract_attachments(m: &ParsedMail, id: &str) -> Result<Vec<Attachment>, ServerError> {
let mut attachments = Vec::new();
for (idx, sp) in m.subparts.iter().enumerate() {
if let Some(attachment) = extract_attachment(sp, id, &[idx]) {
// Filter out inline attachments; they're flattened into the body of the message.
if attachment.disposition == DispositionType::Attachment {
attachments.push(attachment);
}
}
}
Ok(attachments)
}
fn extract_attachment(m: &ParsedMail, id: &str, idx: &[usize]) -> Option<Attachment> {
let pcd = m.get_content_disposition();
let pct = m
.get_headers()
.get_first_value("Content-Type")
.map(|s| parse_content_type(&s));
let filename = match (
pcd.params.get("filename").map(|f| f.clone()),
pct.map(|pct| pct.params.get("name").map(|f| f.clone())),
) {
// Use filename from Content-Disposition
(Some(filename), _) => filename,
// Use filename from Content-Type
(_, Some(Some(name))) => name,
// No known filename, assume it's not an attachment
_ => return None,
};
info!("filename {filename}");
// TODO: grab this from somewhere
let content_id = None;
let bytes = match m.get_body_raw() {
Ok(bytes) => bytes,
Err(err) => {
error!("failed to get body for attachment: {err}");
return None;
}
};
return Some(Attachment {
id: id.to_string(),
idx: idx
.iter()
.map(|i| i.to_string())
.collect::<Vec<_>>()
.join("."),
disposition: pcd.disposition.into(),
filename: Some(filename),
size: bytes.len(),
// TODO: what is the default for ctype?
// TODO: do we want to use m.ctype.params for anything?
content_type: Some(m.ctype.mimetype.clone()),
content_id,
bytes,
});
}
fn email_address_strings(emails: &[Email]) -> Vec<String> {
emails
.iter()
.map(|e| e.to_string())
.inspect(|e| info!("e {e}"))
.collect()
}
fn extract_rfc822(m: &ParsedMail, part_addr: &mut Vec<String>) -> Result<Body, ServerError> {
fn extract_headers(m: &ParsedMail) -> Result<Body, ServerError> {
let path = "<in-memory>";
let from = email_address_strings(&email_addresses(path, &m, "from")?).join(", ");
let to = email_address_strings(&email_addresses(path, &m, "to")?).join(", ");
let cc = email_address_strings(&email_addresses(path, &m, "cc")?).join(", ");
let date = m.headers.get_first_value("date").unwrap_or(String::new());
let subject = m
.headers
.get_first_value("subject")
.unwrap_or(String::new());
let text = format!(
r#"
---------- Forwarded message ----------
From: {from}
To: {to}
CC: {cc}
Date: {date}
Subject: {subject}
"#
);
Ok(Body::text(text))
}
let inner_body = m.get_body()?;
let inner_m = parse_mail(inner_body.as_bytes())?;
let headers = extract_headers(&inner_m)?;
let body = extract_body(&inner_m, part_addr)?;
Ok(flatten_body_parts(&[headers, body]))
}
pub fn get_attachment_filename(header_value: &str) -> &str {
info!("get_attachment_filename {header_value}");
// Strip last "
let v = &header_value[..header_value.len() - 1];
if let Some(idx) = v.rfind('"') {
&v[idx + 1..]
} else {
""
}
}
pub fn get_content_type<'a>(headers: &[MailHeader<'a>]) -> Option<String> {
if let Some(v) = headers.get_first_value("Content-Type") {
if let Some(idx) = v.find(';') {
return Some(v[..idx].to_string());
} else {
return Some(v);
}
}
None
}
fn get_content_id<'a>(headers: &[MailHeader<'a>]) -> Option<String> {
headers.get_first_value("Content-Id")
}
fn render_content_type_tree(m: &ParsedMail) -> String {
const WIDTH: usize = 4;
const SKIP_HEADERS: [&str; 4] = [
"Authentication-Results",
"DKIM-Signature",
"Received",
"Received-SPF",
];
fn render_ct_rec(m: &ParsedMail, depth: usize) -> String {
let mut parts = Vec::new();
let msg = format!("{} {}", "-".repeat(depth * WIDTH), m.ctype.mimetype);
parts.push(msg);
for sp in &m.subparts {
parts.push(render_ct_rec(sp, depth + 1))
}
parts.join("\n")
}
fn render_rec(m: &ParsedMail, depth: usize) -> String {
let mut parts = Vec::new();
let msg = format!("{} {}", "-".repeat(depth * WIDTH), m.ctype.mimetype);
parts.push(msg);
let indent = " ".repeat(depth * WIDTH);
if !m.ctype.charset.is_empty() {
parts.push(format!("{indent} Character Set: {}", m.ctype.charset));
}
for (k, v) in m.ctype.params.iter() {
parts.push(format!("{indent} {k}: {v}"));
}
if !m.headers.is_empty() {
parts.push(format!("{indent} == headers =="));
for h in &m.headers {
if h.get_key().starts_with('X') {
continue;
}
if SKIP_HEADERS.contains(&h.get_key().as_str()) {
continue;
}
parts.push(format!("{indent} {}: {}", h.get_key_ref(), h.get_value()));
}
}
for sp in &m.subparts {
parts.push(render_rec(sp, depth + 1))
}
parts.join("\n")
}
format!(
"Outline:\n{}\n\nDetailed:\n{}\n\nNot showing headers:\n {}\n X.*",
render_ct_rec(m, 1),
render_rec(m, 1),
SKIP_HEADERS.join("\n ")
)
}
#[instrument(name="nm::set_read_status", skip_all, fields(query=%query, unread=unread))]
pub async fn set_read_status<'ctx>(
nm: &Notmuch,
query: &Query,
unread: bool,
) -> Result<bool, ServerError> {
let uids: Vec<_> = query
.uids
.iter()
.filter(|uid| is_notmuch_thread_or_id(uid))
.collect();
info!("set_read_status({unread} {uids:?})");
for uid in uids {
if unread {
nm.tag_add("unread", uid)?;
} else {
nm.tag_remove("unread", uid)?;
}
}
Ok(true)
}
async fn photo_url_for_email_address(
pool: &PgPool,
addr: &str,
) -> Result<Option<String>, ServerError> {
let row = sqlx::query!(
r#"
SELECT
url
FROM email_photo ep
JOIN email_address ea
ON ep.id = ea.email_photo_id
WHERE
address = $1
"#,
addr
)
.fetch_optional(pool)
.await?;
Ok(row.map(|r| r.url))
}
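
walk_attachments above short-circuits on the first Some the visitor returns. For illustration, another closure in the same shape (hypothetical; it complements the two call sites above) that locates the part address of the first PDF subpart:

// Sketch: find the address of the first application/pdf part using the same
// short-circuiting visitor that cid_attachment_bytes and attachment_bytes use.
fn first_pdf_addr(m: &ParsedMail) -> Option<Vec<usize>> {
    walk_attachments(m, |sp, cur_addr| {
        (sp.ctype.mimetype == "application/pdf").then(|| cur_addr.to_vec())
    })
}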

353
server/src/tantivy.rs Normal file

@@ -0,0 +1,353 @@
use std::collections::HashSet;
use log::{debug, error, info, warn};
use sqlx::{postgres::PgPool, types::time::PrimitiveDateTime};
use tantivy::{
collector::{DocSetCollector, TopDocs},
doc, query,
query::{AllQuery, BooleanQuery, Occur, QueryParser, TermQuery},
schema::{Facet, IndexRecordOption, Value},
DocAddress, Index, IndexReader, Searcher, TantivyDocument, TantivyError, Term,
};
use tracing::{info_span, instrument, Instrument};
use crate::{
compute_offset_limit,
error::ServerError,
graphql::{Corpus, ThreadSummary},
newsreader::{extract_thread_id, is_newsreader_thread},
thread_summary_from_row, Query, ThreadSummaryRecord,
};
pub fn is_tantivy_query(query: &Query) -> bool {
query.is_tantivy || query.corpus == Some(Corpus::Tantivy)
}
pub struct TantivyConnection {
db_path: String,
index: Index,
reader: IndexReader,
}
fn get_index(db_path: &str) -> Result<Index, TantivyError> {
Ok(match Index::open_in_dir(db_path) {
Ok(idx) => idx,
Err(err) => {
warn!("Failed to open {db_path}: {err}");
create_news_db(db_path)?;
Index::open_in_dir(db_path)?
}
})
}
impl TantivyConnection {
pub fn new(tantivy_db_path: &str) -> Result<TantivyConnection, TantivyError> {
let index = get_index(tantivy_db_path)?;
let reader = index.reader()?;
Ok(TantivyConnection {
db_path: tantivy_db_path.to_string(),
index,
reader,
})
}
#[instrument(name = "tantivy::refresh", skip_all)]
pub async fn refresh(&self, pool: &PgPool) -> Result<(), ServerError> {
let start_time = std::time::Instant::now();
let p_uids: Vec<_> = sqlx::query_file!("sql/all-uids.sql")
.fetch_all(pool)
.instrument(info_span!("postgres query"))
.await?
.into_iter()
.map(|r| r.uid)
.collect();
info!(
"refresh from postgres got {} uids in {}",
p_uids.len(),
start_time.elapsed().as_secs_f32()
);
let t_span = info_span!("tantivy query");
let _enter = t_span.enter();
let start_time = std::time::Instant::now();
let (searcher, _query) = self.searcher_and_query(&Query::default())?;
let docs = searcher.search(&AllQuery, &DocSetCollector)?;
let uid = self.index.schema().get_field("uid")?;
let t_uids: Vec<_> = docs
.into_iter()
.map(|doc_address| {
searcher
.doc(doc_address)
.map(|doc: TantivyDocument| {
debug!("doc: {doc:#?}");
doc.get_first(uid)
.expect("uid")
.as_str()
.expect("as_str")
.to_string()
})
.expect("searcher.doc")
})
.collect();
drop(_enter);
info!(
"refresh tantivy got {} uids in {}",
t_uids.len(),
start_time.elapsed().as_secs_f32()
);
let t_set: HashSet<_> = t_uids.into_iter().collect();
let need: Vec<_> = p_uids
.into_iter()
.filter(|uid| !t_set.contains(uid.as_str()))
.collect();
if !need.is_empty() {
info!(
"need to reindex {} uids: {:?}...",
need.len(),
&need[..need.len().min(10)]
);
}
let batch_size = 1000;
let uids: Vec<_> = need[..need.len().min(batch_size)]
.into_iter()
.cloned()
.collect();
self.reindex_uids(pool, &uids).await
}
#[instrument(skip(self, pool))]
async fn reindex_uids(&self, pool: &PgPool, uids: &[String]) -> Result<(), ServerError> {
if uids.is_empty() {
return Ok(());
}
// TODO: add SlurpContents and convert HTML to text
let mut index_writer = self.index.writer(50_000_000)?;
let schema = self.index.schema();
let site = schema.get_field("site")?;
let title = schema.get_field("title")?;
let summary = schema.get_field("summary")?;
let link = schema.get_field("link")?;
let date = schema.get_field("date")?;
let is_read = schema.get_field("is_read")?;
let uid = schema.get_field("uid")?;
let id = schema.get_field("id")?;
let tag = schema.get_field("tag")?;
info!("reindexing {} posts", uids.len());
let rows = sqlx::query_file_as!(PostgresDoc, "sql/posts-from-uids.sql", uids)
.fetch_all(pool)
.await?;
if uids.len() != rows.len() {
error!(
"Had {} uids and only got {} rows: uids {uids:?}",
uids.len(),
rows.len()
);
}
for r in rows {
let id_term = Term::from_field_text(uid, &r.uid);
index_writer.delete_term(id_term);
let slug = r.site;
let tag_facet = Facet::from(&format!("/News/{slug}"));
index_writer.add_document(doc!(
site => slug.clone(),
title => r.title,
// TODO: clean and extract text from HTML
summary => r.summary,
link => r.link,
date => tantivy::DateTime::from_primitive(r.date),
is_read => r.is_read,
uid => r.uid,
id => r.id as u64,
tag => tag_facet,
))?;
}
info_span!("IndexWriter.commit").in_scope(|| index_writer.commit())?;
info_span!("IndexReader.reload").in_scope(|| self.reader.reload())?;
Ok(())
}
#[instrument(name = "tantivy::reindex_thread", skip_all, fields(query=%query))]
pub async fn reindex_thread(&self, pool: &PgPool, query: &Query) -> Result<(), ServerError> {
let uids: Vec<_> = query
.uids
.iter()
.filter(|uid| is_newsreader_thread(uid))
.map(|uid| extract_thread_id(uid).to_string())
.collect();
Ok(self.reindex_uids(pool, &uids).await?)
}
#[instrument(name = "tantivy::reindex_all", skip_all)]
pub async fn reindex_all(&self, pool: &PgPool) -> Result<(), ServerError> {
let rows = sqlx::query_file!("sql/all-posts.sql")
.fetch_all(pool)
.await?;
let uids: Vec<String> = rows.into_iter().map(|r| r.uid).collect();
self.reindex_uids(pool, &uids).await?;
Ok(())
}
fn searcher_and_query(
&self,
query: &Query,
) -> Result<(Searcher, Box<dyn query::Query>), ServerError> {
// TODO: only create one reader
// From https://tantivy-search.github.io/examples/basic_search.html
// "For a search server you will typically create one reader for the entire lifetime of
// your program, and acquire a new searcher for every single request."
//
// I think there's some challenge in making the reader work if we reindex, so the reader may
// need to be stored indirectly and be recreated on reindex.
// I think creating a reader takes 200-300 ms.
let schema = self.index.schema();
let searcher = self.reader.searcher();
let title = schema.get_field("title")?;
let summary = schema.get_field("summary")?;
let query_parser = QueryParser::for_index(&self.index, vec![title, summary]);
// Tantivy uses '*' to match all docs, not empty string
let term = &query.remainder.join(" ");
let term = if term.is_empty() { "*" } else { term };
info!("query_parser('{term}')");
let tantivy_query = query_parser.parse_query(&term)?;
let tag = schema.get_field("tag")?;
let is_read = schema.get_field("is_read")?;
let mut terms = vec![(Occur::Must, tantivy_query)];
for t in &query.tags {
let facet = Facet::from(&format!("/{t}"));
let facet_term = Term::from_facet(tag, &facet);
let facet_term_query = Box::new(TermQuery::new(facet_term, IndexRecordOption::Basic));
terms.push((Occur::Must, facet_term_query));
}
if query.unread_only {
info!("searching for unread only");
let term = Term::from_field_bool(is_read, false);
terms.push((
Occur::Must,
Box::new(TermQuery::new(term, IndexRecordOption::Basic)),
));
}
let search_query = BooleanQuery::new(terms);
Ok((searcher, Box::new(search_query)))
}
#[instrument(name="tantivy::count", skip_all, fields(query=%query))]
pub async fn count(&self, query: &Query) -> Result<usize, ServerError> {
if !is_tantivy_query(query) {
return Ok(0);
}
info!("tantivy::count {query:?}");
use tantivy::collector::Count;
let (searcher, query) = self.searcher_and_query(&query)?;
Ok(searcher.search(&query, &Count)?)
}
#[instrument(name="tantivy::search", skip_all, fields(query=%query))]
pub async fn search(
&self,
pool: &PgPool,
after: Option<i32>,
before: Option<i32>,
first: Option<i32>,
last: Option<i32>,
query: &Query,
) -> Result<Vec<(i32, ThreadSummary)>, async_graphql::Error> {
if !is_tantivy_query(query) {
return Ok(Vec::new());
}
let (offset, mut limit) = compute_offset_limit(after, before, first, last);
if before.is_none() {
// When searching forward, the +1 is to see if there are more pages of data available.
// Searching backwards implies there's more pages forward, because the value represented by
// `before` is on the next page.
limit = limit + 1;
}
let (searcher, search_query) = self.searcher_and_query(&query)?;
info!("Tantivy::search(query '{query:?}', off {offset}, lim {limit}, search_query {search_query:?})");
let top_docs = searcher.search(
&search_query,
&TopDocs::with_limit(limit as usize)
.and_offset(offset as usize)
.order_by_u64_field("date", tantivy::index::Order::Desc),
)?;
info!("search found {} docs", top_docs.len());
let uid = self.index.schema().get_field("uid")?;
let uids = top_docs
.into_iter()
.map(|(_, doc_address): (u64, DocAddress)| {
searcher.doc(doc_address).map(|doc: TantivyDocument| {
debug!("doc: {doc:#?}");
doc.get_first(uid)
.expect("doc missing uid")
.as_str()
.expect("doc str missing")
.to_string()
})
})
.collect::<Result<Vec<String>, TantivyError>>()?;
info!("uids {uids:?}");
let rows = sqlx::query_file!("sql/threads-from-uid.sql", &uids as &[String])
.fetch_all(pool)
.await?;
let mut res = Vec::new();
info!("found {} hits joining w/ tantivy", rows.len());
for (i, r) in rows.into_iter().enumerate() {
res.push((
i as i32 + offset,
thread_summary_from_row(ThreadSummaryRecord {
site: r.site,
date: r.date,
is_read: r.is_read,
title: r.title,
uid: r.uid,
name: r.name,
corpus: Corpus::Tantivy,
})
.await,
));
}
Ok(res)
}
pub fn drop_and_load_index(&self) -> Result<(), TantivyError> {
create_news_db(&self.db_path)
}
}
fn create_news_db(tantivy_db_path: &str) -> Result<(), TantivyError> {
info!("create_news_db");
// Don't care if directory didn't exist
let _ = std::fs::remove_dir_all(tantivy_db_path);
std::fs::create_dir_all(tantivy_db_path)?;
use tantivy::schema::*;
let mut schema_builder = Schema::builder();
schema_builder.add_text_field("site", STRING | STORED);
schema_builder.add_text_field("title", TEXT | STORED);
schema_builder.add_text_field("summary", TEXT);
schema_builder.add_text_field("link", STRING | STORED);
schema_builder.add_date_field("date", FAST | INDEXED | STORED);
schema_builder.add_bool_field("is_read", FAST | INDEXED | STORED);
schema_builder.add_text_field("uid", STRING | STORED);
schema_builder.add_u64_field("id", FAST);
schema_builder.add_facet_field("tag", FacetOptions::default());
let schema = schema_builder.build();
Index::create_in_dir(tantivy_db_path, schema)?;
Ok(())
}
struct PostgresDoc {
site: String,
title: String,
summary: String,
link: String,
date: PrimitiveDateTime,
is_read: bool,
uid: String,
id: i32,
}
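
The limit-plus-one probe that appears in newsreader::search, nm::search, and TantivyConnection::search above is easiest to see in isolation. A minimal, self-contained sketch (names hypothetical):

// Fetch one row beyond the page size; the extra row's presence answers
// "is there a next page?" without issuing a second count query.
fn page<T: Clone>(rows: &[T], offset: usize, page_size: usize) -> (Vec<T>, bool) {
    let fetched: Vec<T> = rows.iter().skip(offset).take(page_size + 1).cloned().collect();
    let has_next_page = fetched.len() > page_size;
    (fetched.into_iter().take(page_size).collect(), has_next_page)
}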

35
server/src/ws.rs Normal file

@@ -0,0 +1,35 @@
use std::{collections::HashMap, net::SocketAddr};
use axum::extract::ws::{Message, WebSocket};
use letterbox_shared::WebsocketMessage;
use tracing::{info, warn};
#[derive(Default)]
pub struct ConnectionTracker {
peers: HashMap<SocketAddr, WebSocket>,
}
impl ConnectionTracker {
pub async fn add_peer(&mut self, socket: WebSocket, who: SocketAddr) {
warn!("adding {who:?} to connection tracker");
self.peers.insert(who, socket);
self.send_message_all(WebsocketMessage::RefreshMessages)
.await;
}
pub async fn send_message_all(&mut self, msg: WebsocketMessage) {
info!("send_message_all {msg}");
let m = serde_json::to_string(&msg).expect("failed to json encode WebsocketMessage");
let mut bad_peers = Vec::new();
for (who, socket) in self.peers.iter_mut() {
if let Err(e) = socket.send(Message::Text(m.clone().into())).await {
warn!("{:?} is bad, scheduling for removal: {e}", who);
bad_peers.push(who.clone());
}
}
for b in bad_peers {
info!("removing bad peer {b:?}");
self.peers.remove(&b);
}
}
}
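
A sketch of how a tracker like this might be wired into an axum route, assuming it is shared behind an async Mutex (the handler name and state plumbing are illustrative, not taken from this codebase):

use std::{net::SocketAddr, sync::Arc};

use axum::{
    extract::{ws::WebSocketUpgrade, ConnectInfo, State},
    response::IntoResponse,
};
use tokio::sync::Mutex;

async fn ws_handler(
    ws: WebSocketUpgrade,
    ConnectInfo(who): ConnectInfo<SocketAddr>,
    State(tracker): State<Arc<Mutex<ConnectionTracker>>>,
) -> impl IntoResponse {
    // Register the upgraded socket so later send_message_all calls reach this peer.
    ws.on_upgrade(move |socket| async move {
        tracker.lock().await.add_peer(socket, who).await;
    })
}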


@@ -0,0 +1,59 @@
<!DOCTYPE html>
<html>
<head>
<meta charset=utf-8 />
<meta name="viewport" content="user-scalable=no, initial-scale=1.0, minimum-scale=1.0, maximum-scale=1.0, minimal-ui">
<title>GraphQL Playground</title>
<link rel="stylesheet" href="//cdn.jsdelivr.net/npm/graphql-playground-react/build/static/css/index.css" />
<link rel="shortcut icon" href="//cdn.jsdelivr.net/npm/graphql-playground-react/build/favicon.png" />
<script src="//cdn.jsdelivr.net/npm/graphql-playground-react/build/static/js/middleware.js"></script>
</head>
<body>
<div id="root">
<style>
body {
background-color: rgb(23, 42, 58);
font-family: Open Sans, sans-serif;
height: 90vh;
}
#root {
height: 100%;
width: 100%;
display: flex;
align-items: center;
justify-content: center;
}
.loading {
font-size: 32px;
font-weight: 200;
color: rgba(255, 255, 255, .6);
margin-left: 20px;
}
img {
width: 78px;
height: 78px;
}
.title {
font-weight: 400;
}
</style>
<img src='//cdn.jsdelivr.net/npm/graphql-playground-react/build/logo.png' alt=''>
<div class="loading"> Loading
<span class="title">GraphQL Playground</span>
</div>
</div>
<script>window.addEventListener('load', function (event) {
GraphQLPlayground.init(document.getElementById('root'), {
// options as 'endpoint' belong here
endpoint: "/api/graphql",
})
})</script>
</body>
</html>

42
server/static/vars.css Normal file

@@ -0,0 +1,42 @@
:root {
--active-brightness: 0.85;
--border-radius: 5px;
--box-shadow: 2px 2px 10px;
--color-accent: #118bee15;
--color-bg: #fff;
--color-bg-secondary: #e9e9e9;
--color-link: #118bee;
--color-secondary: #920de9;
--color-secondary-accent: #920de90b;
--color-shadow: #f4f4f4;
--color-table: #118bee;
--color-text: #000;
--color-text-secondary: #999;
--color-scrollbar: #cacae8;
--font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, Oxygen-Sans, Ubuntu, Cantarell, "Helvetica Neue", sans-serif;
--hover-brightness: 1.2;
--justify-important: center;
--justify-normal: left;
--line-height: 1.5;
/*
--width-card: 285px;
--width-card-medium: 460px;
--width-card-wide: 800px;
*/
--width-content: 1080px;
}
@media (prefers-color-scheme: dark) {
:root[color-mode="user"] {
--color-accent: #0097fc4f;
--color-bg: #333;
--color-bg-secondary: #555;
--color-link: #0097fc;
--color-secondary: #e20de9;
--color-secondary-accent: #e20de94f;
--color-shadow: #bbbbbb20;
--color-table: #0097fc;
--color-text: #f7f7f7;
--color-text-secondary: #aaa;
}
}


@@ -1,10 +1,17 @@
[package]
name = "shared"
version = "0.1.0"
edition = "2021"
name = "letterbox-shared"
description = "Shared module for letterbox"
authors.workspace = true
edition.workspace = true
license.workspace = true
publish.workspace = true
repository.workspace = true
version.workspace = true
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
notmuch = { path = "../notmuch" }
build-info = "0.0.40"
letterbox-notmuch = { version = "0.15.11", path = "../notmuch", registry = "xinu" }
serde = { version = "1.0.147", features = ["derive"] }
strum_macros = "0.27.1"


@@ -1,4 +1,7 @@
use notmuch::SearchSummary;
use std::hash::{DefaultHasher, Hash, Hasher};
use build_info::{BuildInfo, VersionControl};
use letterbox_notmuch::SearchSummary;
use serde::{Deserialize, Serialize};
#[derive(Serialize, Deserialize, Debug)]
@@ -10,26 +13,55 @@ pub struct SearchResult {
pub total: usize,
}
#[derive(Serialize, Deserialize, Debug)]
pub struct ShowResult {
messages: Vec<Message>,
#[derive(Serialize, Deserialize, Debug, strum_macros::Display)]
pub enum WebsocketMessage {
RefreshMessages,
}
pub type AttachementId = String;
/// # Number of seconds since the Epoch
pub type UnixTime = isize;
#[derive(Serialize, Deserialize, Debug, Default)]
pub struct Message {
pub from: String,
pub to: Option<String>,
pub cc: Option<String>,
pub timestamp: UnixTime, // date header as unix time
pub date_relative: String, // user-friendly timestamp
pub tags: Vec<String>,
// HTML formatted body
pub body: String,
pub attachment: Vec<AttachementId>,
pub mod urls {
pub const MOUNT_POINT: &'static str = "/api";
pub fn view_original(host: Option<&str>, id: &str) -> String {
if let Some(host) = host {
format!("//{host}/api/original/{id}")
} else {
format!("/api/original/{id}")
}
}
pub fn cid_prefix(host: Option<&str>, cid: &str) -> String {
if let Some(host) = host {
format!("//{host}/api/cid/{cid}/")
} else {
format!("/api/cid/{cid}/")
}
}
pub fn download_attachment(host: Option<&str>, id: &str, idx: &str, filename: &str) -> String {
if let Some(host) = host {
format!(
"//{host}/api/download/attachment/{}/{}/{}",
id, idx, filename
)
} else {
format!("/api/download/attachment/{}/{}/{}", id, idx, filename)
}
}
}
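// Illustration (not part of the original diff): the helpers above emit
// protocol-relative URLs when a host is given, plain absolute paths otherwise.
#[cfg(test)]
mod urls_tests {
    use super::urls;

    #[test]
    fn host_prefixes_are_protocol_relative() {
        assert_eq!(urls::view_original(None, "id:abc"), "/api/original/id:abc");
        assert_eq!(
            urls::view_original(Some("mail.example"), "id:abc"),
            "//mail.example/api/original/id:abc"
        );
        assert_eq!(urls::cid_prefix(None, "cid123"), "/api/cid/cid123/");
    }
}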
pub fn build_version(bi: fn() -> &'static BuildInfo) -> String {
fn commit(git: &Option<VersionControl>) -> String {
let Some(VersionControl::Git(git)) = git else {
return String::new();
};
let mut s = vec!["-".to_string(), git.commit_short_id.clone()];
if let Some(branch) = &git.branch {
s.push(format!(" ({branch})"));
}
s.join("")
}
let bi = bi();
format!("v{}{}", bi.crate_info.version, commit(&bi.version_control)).to_string()
}
pub fn compute_color(data: &str) -> String {
let mut hasher = DefaultHasher::new();
data.hash(&mut hasher);
format!("#{:06x}", hasher.finish() % (1 << 24))
}
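
compute_color is what gives each tag its stable background color: hash the name with DefaultHasher and render the low 24 bits as #rrggbb. A quick property check (a sketch; the exact colors depend on DefaultHasher and are not guaranteed stable across Rust releases):

#[cfg(test)]
mod compute_color_tests {
    use super::compute_color;

    #[test]
    fn deterministic_css_hex() {
        let c = compute_color("tag:inbox");
        assert_eq!(c, compute_color("tag:inbox")); // same input, same color
        assert_eq!(c.len(), 7); // '#' plus exactly six hex digits (value < 2^24)
        assert!(c.starts_with('#'));
    }
}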


@@ -1,16 +1,15 @@
[package]
version = "0.1.0"
name = "letterbox"
repository = "https://github.com/seed-rs/seed-quickstart"
authors = ["Bill Thiede <git@xinu.tv>"]
description = "App Description"
categories = ["category"]
license = "MIT"
readme = "./README.md"
edition = "2018"
name = "letterbox-web"
description = "Web frontend for letterbox"
authors.workspace = true
edition.workspace = true
license.workspace = true
publish.workspace = true
repository.workspace = true
version.workspace = true
[lib]
crate-type = ["cdylib"]
[build-dependencies]
build-info-build = "0.0.40"
[dev-dependencies]
wasm-bindgen-test = "0.3.33"
@@ -18,15 +17,29 @@ wasm-bindgen-test = "0.3.33"
[dependencies]
console_error_panic_hook = "0.1.7"
log = "0.4.17"
seed = "0.9.2"
console_log = {git = "http://git-private.h.xinu.tv/wathiede/console_log.git"}
seed = { version = "0.10.0", features = ["routing"] }
#seed = "0.9.2"
console_log = { version = "0.1.0", registry = "xinu" }
serde = { version = "1.0.147", features = ["derive"] }
notmuch = {path = "../notmuch"}
shared = {path = "../shared"}
itertools = "0.10.5"
itertools = "0.14.0"
serde_json = { version = "1.0.93", features = ["unbounded_depth"] }
wasm-timer = "0.2.5"
css-inline = "0.8.5"
chrono = "0.4.31"
graphql_client = "0.14.0"
thiserror = "2.0.0"
gloo-net = { version = "0.6.0", features = ["json", "serde_json"] }
human_format = "1.1.0"
build-info = "0.0.40"
wasm-bindgen = "=0.2.100"
uuid = { version = "1.13.1", features = [
"js",
] } # direct dep to set js feature, prevents Rng issues
letterbox-shared = { version = "0.15.11", path = "../shared", registry = "xinu" }
letterbox-notmuch = { version = "0.15.11", path = "../notmuch", registry = "xinu" }
seed_hooks = { version = "0.4.0", registry = "xinu" }
strum_macros = "0.27.1"
gloo-console = "0.3.0"
[target.'cfg(target_arch = "wasm32")'.dependencies]
wasm-sockets = "1.0.0"
[package.metadata.wasm-pack.profile.release]
wasm-opt = ['-Os']
@@ -34,6 +47,13 @@ wasm-opt = ['-Os']
[dependencies.web-sys]
version = "0.3.58"
features = [
"Clipboard",
"DomRect",
"Element",
"History",
"MediaQueryList",
"Window"
"Navigator",
"Performance",
"ScrollRestoration",
"Window",
]


@@ -1,6 +0,0 @@
.PHONY: all
# Build in release mode and push to minio for serving.
all:
trunk build --release
mc mirror --overwrite --remove dist/ m/letterbox/

web/Trunk.toml Normal file

@@ -0,0 +1,27 @@
[build]
release = false
[serve]
# The address to serve on.
address = "0.0.0.0"
port = 6758
[[proxy]]
ws = true
backend = "ws://localhost:9345/api/ws"
[[proxy]]
backend = "http://localhost:9345/api/"
[[proxy]]
backend = "http://localhost:9345/notification/"
[[hooks]]
stage = "pre_build"
command = "printf"
command_arguments = ["\\033c"]
#[[hooks]]
#stage = "pre_build"
#command = "cargo"
#command_arguments = [ "test" ]

web/build.rs Normal file

@@ -0,0 +1,5 @@
fn main() {
// Calling `build_info_build::build_script` collects all data and makes it available to `build_info::build_info!`
// and `build_info::format!` in the main program.
build_info_build::build_script();
}
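The consuming side of this build script appears in web/src/state.rs below; a condensed sketch of the pairing, assuming the build-info 0.0.40 API used throughout this diff:
// The macro generates `fn bi() -> &'static BuildInfo` from the data the
// build script collected; build_version() then formats it for display.
build_info::build_info!(fn bi);
let version = letterbox_shared::build_version(bi);
log::info!("Build Info: {}", version); // e.g. "v0.15.11-abc1234 (main)" for a git build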


@@ -0,0 +1,3 @@
mutation AddTagMutation($query: String!, $tag: String!) {
tagAdd(query:$query, tag:$tag)
}


@@ -0,0 +1,3 @@
query CatchupQuery($query: String!) {
catchup(query: $query)
}


@@ -0,0 +1,27 @@
query FrontPageQuery($query: String!, $after: String, $before: String, $first: Int, $last: Int) {
count(query: $query)
search(query: $query, after: $after, before: $before, first: $first, last: $last) {
pageInfo {
hasPreviousPage
hasNextPage
startCursor
endCursor
}
nodes {
thread
total
timestamp
subject
authors
tags
corpus
}
}
tags {
name
bgColor
fgColor
unread
}
version
}
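The search field uses cursor pagination: page forward with first plus after = the previous endCursor, and backward with last plus before = startCursor, as the NextPage/PreviousPage handlers in web/src/state.rs do. A condensed sketch with illustrative values:
// Forward page request (mirrors Msg::NextPage in state.rs):
let vars = front_page_query::Variables {
    query: "is:unread".into(),       // illustrative query
    after: pager.end_cursor.clone(), // cursor from the previous pageInfo
    before: None,
    first: Some(SEARCH_RESULTS_PER_PAGE as i64),
    last: None,
};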


@@ -0,0 +1,3 @@
mutation MarkReadMutation($query: String!, $unread: Boolean!) {
setReadStatus(query:$query, unread:$unread)
}


@@ -0,0 +1,3 @@
mutation RefreshMutation {
refresh
}


@@ -0,0 +1,3 @@
mutation RemoveTagMutation($query: String!, $tag: String!) {
tagRemove(query:$query, tag:$tag)
}

web/graphql/schema.json Normal file

File diff suppressed because it is too large


@@ -0,0 +1,77 @@
query ShowThreadQuery($threadId: String!) {
thread(threadId: $threadId) {
__typename ... on NewsPost{
threadId
isRead
slug
site
title
body
url
timestamp
# TODO: unread
}
__typename ... on EmailThread{
threadId,
subject
messages {
id
subject
tags
from {
name
addr
photoUrl
}
to {
name
addr
}
cc {
name
addr
}
xOriginalTo {
name
addr
}
deliveredTo {
name
addr
}
timestamp
body {
__typename
... on UnhandledContentType {
contents
contentTree
}
... on PlainText {
contents
contentTree
}
... on Html {
contents
contentTree
}
}
path
attachments {
id
idx
filename
contentType
contentId
size
}
}
}
}
tags {
name
bgColor
fgColor
unread
}
version
}

web/graphql/update_schema.sh Executable file

@@ -0,0 +1,4 @@
DEV_HOST=localhost
DEV_PORT=9345
graphql-client introspect-schema http://${DEV_HOST:?}:${DEV_PORT:?}/api/graphql --output schema.json
git diff schema.json


@@ -2,91 +2,24 @@
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<link rel="modulepreload" href="/pkg/package.js" as="script" type="text/javascript">
<link rel="preload" href="/pkg/package_bg.wasm" as="fetch" type="application/wasm" crossorigin="anonymous">
<link rel="stylesheet", href="https://jenil.github.io/bulmaswatch/cyborg/bulmaswatch.min.css">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/font-awesome/6.3.0/css/all.min.css" integrity="sha512-SzlrxWUlpfuzQ+pcUCosxcglQRNAq/DZjVsC0lE40xsADsfeQoEypE+enwcOiGjk/bSuGGKHEyjSoQ1zVisanQ==" crossorigin="anonymous" referrerpolicy="no-referrer" />
<style>
.message {
padding-left: 0.5em;
}
.body {
background: white;
color: black;
padding-bottom: 1em;
}
.error {
background-color: red;
}
.view-part-text-plain {
white-space: pre-line;
}
iframe {
height: 100%;
width: 100%;
}
.index .from {
width: 200px;
}
.index .subject {
}
.index .date {
white-space: nowrap;
}
.footer {
background-color: #eee;
color: #222;
position: fixed;
bottom: 0;
left: 0;
right: 0;
height: 3em;
padding: 1em;
}
.tag {
margin-right: 2px;
}
.debug ul {
padding-left: 2em;
}
.debug li {
}
.loading {
animation-name: spin;
animation-duration: 1000ms;
animation-iteration-count: infinite;
animation-timing-function: linear;
}
@keyframes spin {
from {
transform:rotate(0deg);
}
to {
transform:rotate(360deg);
}
}
@media (max-width: 768px) {
.section {
padding: 1.5em;
}
}
input, .input {
color: #000;
}
input::placeholder, .input::placeholder{
color: #555;
}
</style>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/font-awesome/6.7.2/css/all.min.css"
integrity="sha512-Evv84Mr4kqVGRNSgIGL/F/aIDqQb7xQ2vcrdIwxfjThSH8CSR7PBEakCr51Ck+w+/U6swU2Im1vVX0SVk9ABhg=="
crossorigin="anonymous" referrerpolicy="no-referrer" />
<link rel="icon" href="https://static.xinu.tv/favicon/letterbox.svg" />
<!-- tall thin font for user icon -->
<link rel="preconnect" href="https://fonts.googleapis.com">
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
<link href="https://fonts.googleapis.com/css2?family=Poppins:wght@700&display=swap" rel="stylesheet">
<!-- <link data-trunk rel="css" href="static/site-specific.css" /> -->
<link data-trunk rel="css" href="static/vars.css" />
<link data-trunk rel="tailwind-css" href="./src/tailwind.css" />
<link data-trunk rel="css" href="static/overrides.css" />
</head>
<body>
<section id="app"></section>
<script type="module">
import init from '/pkg/package.js';
init('/pkg/package_bg.wasm');
</script>
<section id="app"></section>
</body>
</html>
</html>


@@ -1,16 +1,14 @@
use seed::Url;
const BASE_URL: &str = "/api";
pub fn refresh() -> String {
format!("{BASE_URL}/refresh")
}
pub fn search(query: &str, page: usize, results_per_page: usize) -> String {
let query = Url::encode_uri_component(query);
format!("{BASE_URL}/search/{query}?page={page}&results_per_page={results_per_page}")
}
pub fn show(tid: &str) -> String {
format!("{BASE_URL}/show/{tid}")
}
pub fn original(message_id: &str) -> String {
format!("{BASE_URL}/original/{message_id}")
pub mod urls {
use seed::Url;
pub fn search(query: &str, page: usize) -> Url {
let query = Url::encode_uri_component(query);
if page > 0 {
Url::new().set_hash_path(["s", &query, &format!("p{page}")])
} else {
Url::new().set_hash_path(["s", &query])
}
}
pub fn thread(tid: &str) -> Url {
Url::new().set_hash_path(["t", tid])
}
}
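These helpers build the hash-routed URLs that on_url_changed in web/src/state.rs parses back apart; a sketch of the round trip (values illustrative, rendering details depend on seed's Url display):
let url = urls::search("is:unread", 2); // roughly "#/s/is%3Aunread/p2"
let url = urls::thread("1234abcd");     // roughly "#/t/1234abcd"
// page 0 omits the trailing "p{page}" segment entirely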

web/src/consts.rs Normal file

@@ -0,0 +1 @@
pub const SEARCH_RESULTS_PER_PAGE: usize = 20;

web/src/graphql.rs Normal file

@@ -0,0 +1,77 @@
use gloo_net::{http::Request, Error};
use graphql_client::GraphQLQuery;
use serde::{de::DeserializeOwned, Serialize};
// The paths are relative to the directory where your `Cargo.toml` is located.
// Both json and the GraphQL schema language are supported as sources for the schema
#[derive(GraphQLQuery)]
#[graphql(
schema_path = "graphql/schema.json",
query_path = "graphql/front_page.graphql",
response_derives = "Debug"
)]
pub struct FrontPageQuery;
#[derive(GraphQLQuery)]
#[graphql(
schema_path = "graphql/schema.json",
query_path = "graphql/catchup.graphql",
response_derives = "Debug"
)]
pub struct CatchupQuery;
#[derive(GraphQLQuery)]
#[graphql(
schema_path = "graphql/schema.json",
query_path = "graphql/show_thread.graphql",
response_derives = "Debug"
)]
pub struct ShowThreadQuery;
#[derive(GraphQLQuery)]
#[graphql(
schema_path = "graphql/schema.json",
query_path = "graphql/mark_read.graphql",
response_derives = "Debug"
)]
pub struct MarkReadMutation;
#[derive(GraphQLQuery)]
#[graphql(
schema_path = "graphql/schema.json",
query_path = "graphql/add_tag.graphql",
response_derives = "Debug"
)]
pub struct AddTagMutation;
#[derive(GraphQLQuery)]
#[graphql(
schema_path = "graphql/schema.json",
query_path = "graphql/remove_tag.graphql",
response_derives = "Debug"
)]
pub struct RemoveTagMutation;
#[derive(GraphQLQuery)]
#[graphql(
schema_path = "graphql/schema.json",
query_path = "graphql/refresh.graphql",
response_derives = "Debug"
)]
pub struct RefreshMutation;
pub async fn send_graphql<Body, Resp>(body: Body) -> Result<graphql_client::Response<Resp>, Error>
where
Body: Serialize,
Resp: DeserializeOwned + 'static,
{
use web_sys::RequestMode;
Request::post("/api/graphql/")
.mode(RequestMode::Cors)
.json(&body)?
.send()
.await?
.json()
.await
}
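Every call site pairs send_graphql with a generated query's build_query; condensed from the RefreshMutation invocation in web/src/state.rs:
// The turbofish pins the response type defined by the generated module.
let resp = send_graphql::<_, refresh_mutation::ResponseData>(
    RefreshMutation::build_query(refresh_mutation::Variables {}),
)
.await; // Result<graphql_client::Response<ResponseData>, gloo_net::Error>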


@@ -1,580 +0,0 @@
mod api;
mod nm;
use std::{
collections::hash_map::DefaultHasher,
hash::{Hash, Hasher},
};
use itertools::Itertools;
use log::{debug, error, info, Level};
use notmuch::ThreadSet;
use seed::{prelude::*, *};
use serde::Deserialize;
use wasm_timer::Instant;
const SEARCH_RESULTS_PER_PAGE: usize = 20;
// ------ ------
// Init
// ------ ------
// `init` describes what should happen when your app started.
fn init(url: Url, orders: &mut impl Orders<Msg>) -> Model {
orders
.subscribe(on_url_changed)
.notify(subs::UrlChanged(url.clone()));
Model {
context: Context::None,
query: "".to_string(),
refreshing_state: RefreshingState::None,
}
}
fn on_url_changed(uc: subs::UrlChanged) -> Msg {
let mut url = uc.0;
info!(
"url changed '{}', history {}",
url,
history().length().unwrap_or(0)
);
let hpp = url.remaining_hash_path_parts();
match hpp.as_slice() {
["t", tid] => Msg::ShowRequest(tid.to_string()),
["s", query] => {
let query = Url::decode_uri_component(query).unwrap_or("".to_string());
Msg::SearchRequest {
query,
page: 0,
results_per_page: SEARCH_RESULTS_PER_PAGE,
}
}
["s", query, page] => {
let query = Url::decode_uri_component(query).unwrap_or("".to_string());
let page = page[1..].parse().unwrap_or(0);
Msg::SearchRequest {
query,
page,
results_per_page: SEARCH_RESULTS_PER_PAGE,
}
}
p => {
if !p.is_empty() {
info!("Unhandled path '{p:?}'");
}
Msg::SearchRequest {
query: "".to_string(),
page: 0,
results_per_page: SEARCH_RESULTS_PER_PAGE,
}
}
}
}
mod urls {
use seed::Url;
pub fn search(query: &str, page: usize) -> Url {
let query = Url::encode_uri_component(query);
if page > 0 {
Url::new().set_hash_path(["s", &query, &format!("p{page}")])
} else {
Url::new().set_hash_path(["s", &query])
}
}
pub fn thread(tid: &str) -> Url {
Url::new().set_hash_path(["t", tid])
}
}
// ------ ------
// Model
// ------ ------
enum Context {
None,
Search(shared::SearchResult),
Thread(Vec<shared::Message>),
}
// `Model` describes our app state.
struct Model {
query: String,
context: Context,
refreshing_state: RefreshingState,
}
#[derive(Debug, PartialEq)]
enum RefreshingState {
None,
Loading,
Error(String),
}
// ------ ------
// Update
// ------ ------
// (Remove the line below once any of your `Msg` variants doesn't implement `Copy`.)
// `Msg` describes the different events you can modify state with.
pub enum Msg {
Noop,
RefreshStart,
RefreshDone(Option<FetchError>),
SearchRequest {
query: String,
page: usize,
results_per_page: usize,
},
SearchResult(fetch::Result<shared::SearchResult>),
ShowRequest(String),
ShowResult(fetch::Result<Vec<shared::Message>>),
NextPage,
PreviousPage,
}
// `update` describes how to handle each `Msg`.
fn update(msg: Msg, model: &mut Model, orders: &mut impl Orders<Msg>) {
match msg {
Msg::Noop => {}
Msg::RefreshStart => {
model.refreshing_state = RefreshingState::Loading;
orders.perform_cmd(async move { Msg::RefreshDone(refresh_request().await.err()) });
}
Msg::RefreshDone(err) => {
model.refreshing_state = if let Some(err) = err {
RefreshingState::Error(format!("{:?}", err))
} else {
// If looking at search page, refresh the search to view update on the server side.
if let Context::Search(sr) = &model.context {
let query = sr.query.clone();
let page = sr.page;
let results_per_page = sr.results_per_page;
orders.perform_cmd(async move {
Msg::SearchResult(search_request(&query, page, results_per_page).await)
});
}
RefreshingState::None
};
}
Msg::SearchRequest {
query,
page,
results_per_page,
} => {
info!("searching for '{query}' pg {page} # / pg {results_per_page}");
model.query = query.clone();
orders.skip().perform_cmd(async move {
Msg::SearchResult(search_request(&query, page, results_per_page).await)
});
}
Msg::SearchResult(Ok(response_data)) => {
debug!("fetch ok {:#?}", response_data);
model.context = Context::Search(response_data);
}
Msg::SearchResult(Err(fetch_error)) => {
error!("fetch failed {:?}", fetch_error);
}
Msg::ShowRequest(tid) => {
orders
.skip()
.perform_cmd(async move { Msg::ShowResult(show_request(&tid).await) });
}
Msg::ShowResult(Ok(response_data)) => {
debug!("fetch ok {:#?}", response_data);
model.context = Context::Thread(response_data);
}
Msg::ShowResult(Err(fetch_error)) => {
error!("fetch failed {:?}", fetch_error);
}
Msg::NextPage => {
match &model.context {
Context::Search(sr) => {
orders.request_url(urls::search(&sr.query, sr.page + 1));
}
Context::Thread(_) => (), // do nothing (yet?)
Context::None => (), // do nothing (yet?)
};
}
Msg::PreviousPage => {
match &model.context {
Context::Search(sr) => {
orders.request_url(urls::search(&sr.query, sr.page.saturating_sub(1)));
}
Context::Thread(_) => (), // do nothing (yet?)
Context::None => (), // do nothing (yet?)
};
}
}
}
pub async fn show_request(tid: &str) -> fetch::Result<Vec<shared::Message>> {
let b = Request::new(api::show(tid))
.method(Method::Get)
.fetch()
.await?
.check_status()?
.bytes()
.await?;
let mut deserializer = serde_json::Deserializer::from_slice(&b);
deserializer.disable_recursion_limit();
Ok(Vec::<shared::Message>::deserialize(&mut deserializer)
.map_err(|_| FetchError::JsonError(fetch::JsonError::Serde(JsValue::NULL)))?)
}
async fn search_request(
query: &str,
page: usize,
results_per_page: usize,
) -> fetch::Result<shared::SearchResult> {
Request::new(api::search(query, page, results_per_page))
.method(Method::Get)
.fetch()
.await?
.check_status()?
.json()
.await
}
async fn refresh_request() -> fetch::Result<()> {
let t = Request::new(api::refresh())
.method(Method::Get)
.fetch()
.await?
.check_status()?
.text()
.await?;
info!("refresh {t}");
Ok(())
}
// ------ ------
// View
// ------ ------
fn set_title(title: &str) {
seed::document().set_title(&format!("lb: {}", title));
}
fn tags_chiclet(tags: &[String], is_mobile: bool) -> impl Iterator<Item = Node<Msg>> + '_ {
tags.iter().map(move |tag| {
let mut hasher = DefaultHasher::new();
tag.hash(&mut hasher);
let hex = format!("#{:06x}", hasher.finish() % (1 << 24));
let style = style! {St::BackgroundColor=>hex};
let classes = C!["tag", IF!(is_mobile => "is-small")];
let tag = tag.clone();
a![
attrs! {
At::Href => urls::search(&format!("tag:{tag}"), 0)
},
match tag.as_str() {
"attachment" => span![classes, style, "📎"],
"replied" => span![classes, style, i![C!["fa-solid", "fa-reply"]]],
_ => span![classes, style, &tag],
},
ev(Ev::Click, move |_| Msg::SearchRequest {
query: format!("tag:{tag}"),
page: 0,
results_per_page: SEARCH_RESULTS_PER_PAGE,
})
]
})
}
fn pretty_authors(authors: &str) -> impl Iterator<Item = Node<Msg>> + '_ {
let one_person = authors.matches(',').count() == 0;
let authors = authors.split(',');
Itertools::intersperse(
authors.filter_map(move |author| {
if one_person {
return Some(span![
attrs! {
At::Title => author.trim()},
author
]);
}
author.split_whitespace().nth(0).map(|first| {
span![
attrs! {
At::Title => author.trim()},
first
]
})
}),
span![", "],
)
}
fn view_mobile_search_results(query: &str, search_results: &shared::SearchResult) -> Node<Msg> {
if query.is_empty() {
set_title("all mail");
} else {
set_title(query);
}
let summaries = &search_results.summary.0;
let rows = summaries.iter().map(|r| {
/*
let tid = r.thread.clone();
tr![
td![
C!["from"],
pretty_authors(&r.authors),
IF!(r.total>1 => small![" ", r.total.to_string()]),
],
td![C!["subject"], tags_chiclet(&r.tags), " ", &r.subject],
td![C!["date"], &r.date_relative],
ev(Ev::Click, move |_| Msg::ShowRequest(tid)),
]
*/
let tid = r.thread.clone();
div![
div![
C!["subject"],
&r.subject,
ev(Ev::Click, move |_| Msg::ShowRequest(tid)),
],
div![
span![C!["from"], pretty_authors(&r.authors)],
span![C!["tags"], tags_chiclet(&r.tags, true)],
],
span![C!["date"], &r.date_relative],
hr![],
]
});
let first = search_results.page * search_results.results_per_page;
div![
h1!["Search results"],
view_search_pager(first, summaries.len(), search_results.total),
rows,
view_search_pager(first, summaries.len(), search_results.total)
]
}
fn view_search_results(query: &str, search_results: &shared::SearchResult) -> Node<Msg> {
if query.is_empty() {
set_title("all mail");
} else {
set_title(query);
}
let summaries = &search_results.summary.0;
let rows = summaries.iter().map(|r| {
let tid = r.thread.clone();
tr![
td![
C!["from"],
pretty_authors(&r.authors),
IF!(r.total>1 => small![" ", r.total.to_string()]),
],
td![
C!["subject"],
tags_chiclet(&r.tags, false),
" ",
a![
C!["has-text-light"],
attrs! {
At::Href => urls::thread(&tid)
},
&r.subject,
]
],
td![C!["date"], &r.date_relative]
]
});
let first = search_results.page * search_results.results_per_page;
div![
view_search_pager(first, summaries.len(), search_results.total),
table![
C![
"table",
"index",
"is-fullwidth",
"is-hoverable",
"is-narrow",
"is-striped",
],
thead![tr![
th![C!["from"], "From"],
th![C!["subject"], "Subject"],
th![C!["date"], "Date"]
]],
tbody![rows]
],
view_search_pager(first, summaries.len(), search_results.total)
]
}
fn view_search_pager(start: usize, count: usize, total: usize) -> Node<Msg> {
let is_first = start <= 0;
let is_last = (start + SEARCH_RESULTS_PER_PAGE) >= total;
nav![
C!["pagination"],
a![
C![
"pagination-previous",
"button",
IF!(is_first => "is-static"),
IF!(is_first => "is-info"),
],
"<",
ev(Ev::Click, |_| Msg::PreviousPage)
],
a![
C!["pagination-next", "button", IF!(is_last => "is-static")],
IF!(is_last => attrs!{ At::Disabled=>true }),
">",
ev(Ev::Click, |_| Msg::NextPage)
],
ul![
C!["pagination-list"],
li![format!("{} - {} of {}", start, start + count, total)],
],
]
}
fn view_header(query: &str, refresh_request: &RefreshingState) -> Node<Msg> {
let is_loading = refresh_request == &RefreshingState::Loading;
let is_error = if let RefreshingState::Error(err) = refresh_request {
error!("Failed to refresh: {err:?}");
true
} else {
false
};
let query = Url::decode_uri_component(query).unwrap_or("".to_string());
nav![
C!["navbar"],
attrs! {At::Role=>"navigation"},
div![
C!["navbar-start"],
a![
C!["navbar-item", "button", IF![is_error => "is-danger"]],
span![i![C![
"fa-solid",
"fa-arrow-rotate-right",
"refresh",
IF![is_loading => "loading"],
]]],
ev(Ev::Click, |_| Msg::RefreshStart),
],
a![
C!["navbar-item", "button"],
attrs! {
At::Href => urls::search("is:unread", 0)
},
"Unread",
],
a![
C!["navbar-item", "button"],
attrs! {
At::Href => urls::search("", 0)
},
"All",
],
input![
C!["navbar-item", "input"],
attrs! {
At::Placeholder => "Search";
At::AutoFocus => true.as_at_value();
At::Value => query,
},
input_ev(Ev::Input, |q| Msg::SearchRequest {
query: Url::encode_uri_component(q),
page: 0,
results_per_page: SEARCH_RESULTS_PER_PAGE,
}),
// Resend search on enter.
keyboard_ev(Ev::KeyUp, move |e| if e.key_code() == 0x0d {
Msg::SearchRequest {
query: Url::encode_uri_component(query),
page: 0,
results_per_page: SEARCH_RESULTS_PER_PAGE,
}
} else {
Msg::Noop
}),
]
]
]
}
fn view_footer(render_time_ms: u128) -> Node<Msg> {
footer![
C!["footer"],
div![
C!["content", "has-text-right", "is-size-7"],
format!("Render time {} ms", render_time_ms)
]
]
}
fn view_thread(messages: &[shared::Message]) -> Node<Msg> {
div![
"MESSAGES GO HERE",
ol![messages.iter().map(|msg| li![format!("{:?}", msg)])]
]
}
fn view_desktop(model: &Model) -> Node<Msg> {
let content = match &model.context {
Context::None => div![h1!["Loading"]],
Context::Thread(thread_set) => view_thread(thread_set),
Context::Search(search_results) => view_search_results(&model.query, search_results),
};
div![
view_header(&model.query, &model.refreshing_state),
section![C!["section"], div![C!["container"], content],]
]
}
fn view_mobile(model: &Model) -> Node<Msg> {
let content = match &model.context {
Context::None => div![h1!["Loading"]],
Context::Thread(thread_set) => view_thread(thread_set),
Context::Search(search_results) => view_mobile_search_results(&model.query, search_results),
};
div![
view_header(&model.query, &model.refreshing_state),
section![C!["section"], div![C!["content"], content],]
]
}
// `view` describes what to display.
fn view(model: &Model) -> Node<Msg> {
info!("refreshing {:?}", model.refreshing_state);
let is_mobile = seed::window()
.match_media("(max-width: 768px)")
.expect("failed media query")
.map(|mql| mql.matches())
.unwrap_or(false);
let start = Instant::now();
info!("view called");
div![
if is_mobile {
view_mobile(model)
} else {
view_desktop(model)
},
view_footer(start.elapsed().as_millis())
]
}
// ------ ------
// Start
// ------ ------
// (This function is invoked by `init` function in `index.html`.)
#[wasm_bindgen(start)]
pub fn start() {
// This provides better error messages in debug mode.
// It's disabled in release mode so it doesn't bloat up the file size.
#[cfg(debug_assertions)]
console_error_panic_hook::set_once();
let lvl = Level::Info;
console_log::init_with_level(lvl).expect("failed to initialize console logging");
// Mount the `app` to the element with the `id` "app".
App::start("app", init, update, view);
}

web/src/main.rs Normal file

@@ -0,0 +1,31 @@
// (Lines like the one below ignore selected Clippy rules
// - it's useful when you want to check your code with `cargo make verify`
// but some rules are too "annoying" or are not applicable for your case.)
#![allow(clippy::wildcard_imports)]
// Until https://github.com/rust-lang/rust/issues/138762 is addressed in dependencies
#![allow(wasm_c_abi)]
use log::Level;
use seed::App;
mod api;
mod consts;
mod graphql;
mod state;
mod view;
mod websocket;
fn main() {
// This provides better error messages in debug mode.
// It's disabled in release mode so it doesn't bloat up the file size.
#[cfg(debug_assertions)]
console_error_panic_hook::set_once();
#[cfg(debug_assertions)]
let lvl = Level::Debug;
#[cfg(not(debug_assertions))]
let lvl = Level::Info;
console_log::init_with_level(lvl).expect("failed to initialize console logging");
// Mount the `app` to the element with the `id` "app".
App::start("app", state::init, state::update, view::view);
}


@@ -1,193 +0,0 @@
use notmuch::{Content, Part, Thread, ThreadNode, ThreadSet};
use seed::{prelude::*, *};
use serde::de::Deserialize;
use crate::{api, set_title, Msg};
pub async fn show_request(tid: &str) -> fetch::Result<ThreadSet> {
let b = Request::new(api::show(tid))
.method(Method::Get)
.fetch()
.await?
.check_status()?
.bytes()
.await?;
let mut deserializer = serde_json::Deserializer::from_slice(&b);
deserializer.disable_recursion_limit();
Ok(ThreadSet::deserialize(&mut deserializer)
.map_err(|_| FetchError::JsonError(fetch::JsonError::Serde(JsValue::NULL)))?)
}
pub fn view_thread(thread_set: &ThreadSet) -> Node<Msg> {
assert_eq!(thread_set.0.len(), 1);
let thread = &thread_set.0[0];
assert_eq!(thread.0.len(), 1);
let thread_node = &thread.0[0];
let subject = first_subject(&thread_node).unwrap_or("<No subject>".to_string());
set_title(&subject);
div![
h1![subject],
a![
attrs! {At::Href=>api::original(&thread_node.0.as_ref().expect("message missing").id)},
"Original"
],
view_message(&thread_node),
div![
C!["debug"],
"Add zippy for debug dump",
view_debug_thread_set(thread_set)
] /* pre![format!("Thread: {:#?}", thread_set).replace(" ", " ")] */
]
}
// <subject>
// <tags>
//
// <from1> <date>
// <to1>
// <content1>
// <zippy>
// <children1>
// </zippy>
//
// <from2> <date>
// <to2>
// <body2>
fn view_message(thread: &ThreadNode) -> Node<Msg> {
let message = thread.0.as_ref().expect("ThreadNode missing Message");
let children = &thread.1;
div![
C!["message"],
/* TODO(wathiede): collect all the tags and show them here. */
/* TODO(wathiede): collect all the attachments from all the subparts */
div![C!["header"], "From: ", &message.headers.from],
div![C!["header"], "Date: ", &message.headers.date],
div![C!["header"], "To: ", &message.headers.to],
hr![],
div![
C!["body"],
match &message.body {
Some(body) => view_body(body.as_slice()),
None => div!["<no body>"],
},
],
children.iter().map(view_message)
]
}
fn view_body(body: &[Part]) -> Node<Msg> {
div![body.iter().map(view_part)]
}
fn view_text_plain(content: &Option<Content>) -> Node<Msg> {
match &content {
Some(Content::String(content)) => p![C!["view-part-text-plain"], content],
_ => div![
C!["error"],
format!("Unhandled content enum for text/plain"),
],
}
}
fn view_part(part: &Part) -> Node<Msg> {
match part.content_type.as_str() {
"text/plain" => view_text_plain(&part.content),
"text/html" => {
if let Some(Content::String(html)) = &part.content {
let inliner = css_inline::CSSInliner::options()
.load_remote_stylesheets(false)
.remove_style_tags(true)
.build();
let inlined = inliner.inline(html).expect("failed to inline CSS");
return div![C!["view-part-text-html"], div!["TEST"], raw![&inlined]];
} else {
div![
C!["error"],
format!("Unhandled content enum for multipart/mixed"),
]
}
}
// https://en.wikipedia.org/wiki/MIME#alternative
// RFC1341 states: In general, user agents that compose multipart/alternative entities
// should place the body parts in increasing order of preference, that is, with the
// preferred format last.
"multipart/alternative" => {
if let Some(Content::Multipart(parts)) = &part.content {
for part in parts.iter().rev() {
if part.content_type == "text/html" {
if let Some(Content::String(html)) = &part.content {
let inliner = css_inline::CSSInliner::options()
.load_remote_stylesheets(false)
.remove_style_tags(true)
.build();
let inlined = inliner.inline(html).expect("failed to inline CSS");
return div![Node::from_html(None, &inlined)];
}
}
if part.content_type == "text/plain" {
return view_text_plain(&part.content);
}
}
div!["No known multipart/alternative parts"]
} else {
div![
C!["error"],
format!("multipart/alternative with non-multipart content"),
]
}
}
"multipart/mixed" => match &part.content {
Some(Content::Multipart(parts)) => div![parts.iter().map(view_part)],
_ => div![
C!["error"],
format!("Unhandled content enum for multipart/mixed"),
],
},
_ => div![
C!["error"],
format!("Unhandled content type: {}", part.content_type)
],
}
}
fn first_subject(thread: &ThreadNode) -> Option<String> {
if let Some(msg) = &thread.0 {
return Some(msg.headers.subject.clone());
} else {
for tn in &thread.1 {
if let Some(s) = first_subject(&tn) {
return Some(s);
}
}
}
None
}
fn view_debug_thread_set(thread_set: &ThreadSet) -> Node<Msg> {
ul![thread_set
.0
.iter()
.enumerate()
.map(|(i, t)| { li!["t", i, ": ", view_debug_thread(t),] })]
}
fn view_debug_thread(thread: &Thread) -> Node<Msg> {
ul![thread
.0
.iter()
.enumerate()
.map(|(i, tn)| { li!["tn", i, ": ", view_debug_thread_node(tn),] })]
}
fn view_debug_thread_node(thread_node: &ThreadNode) -> Node<Msg> {
ul![
IF!(thread_node.0.is_some()=>li!["tn id:", &thread_node.0.as_ref().unwrap().id]),
thread_node.1.iter().enumerate().map(|(i, tn)| li![
"tn",
i,
": ",
view_debug_thread_node(tn)
])
]
}

web/src/state.rs Normal file

@@ -0,0 +1,844 @@
use std::collections::HashSet;
use graphql_client::GraphQLQuery;
use letterbox_shared::WebsocketMessage;
use log::{debug, error, info, warn};
use seed::{prelude::*, *};
use thiserror::Error;
use web_sys::HtmlElement;
use crate::{
api::urls,
consts::SEARCH_RESULTS_PER_PAGE,
graphql,
graphql::{front_page_query::*, send_graphql, show_thread_query::*},
websocket,
};
/// Used to fake the unread string while in development
pub fn unread_query() -> &'static str {
let host = seed::window()
.location()
.host()
.expect("failed to get host");
if host.starts_with("6758.") {
return "tag:letterbox";
}
"is:unread"
}
// `init` describes what should happen when your app starts.
pub fn init(url: Url, orders: &mut impl Orders<Msg>) -> Model {
let version = letterbox_shared::build_version(bi);
info!("Build Info: {}", version);
// Disable restoring to scroll position when navigating
window()
.history()
.expect("couldn't get history")
.set_scroll_restoration(web_sys::ScrollRestoration::Manual)
.expect("failed to set scroll restoration to manual");
if url.hash().is_none() {
orders.request_url(urls::search(unread_query(), 0));
} else {
orders.request_url(url.clone());
};
// TODO(wathiede): only do this while viewing the index? Or maybe add a new message that forces
// 'notmuch new' on the server periodically?
//orders.stream(streams::interval(30_000, || Msg::RefreshStart));
orders.subscribe(Msg::OnUrlChanged);
orders.stream(streams::window_event(Ev::Scroll, |_| Msg::WindowScrolled));
build_info::build_info!(fn bi);
Model {
context: Context::None,
query: "".to_string(),
refreshing_state: RefreshingState::None,
tags: None,
read_completion_ratio: 0.,
content_el: ElRef::<HtmlElement>::default(),
versions: Version {
client: version,
server: None,
},
catchup: None,
last_url: Url::current(),
websocket: websocket::init("/api/ws", &mut orders.proxy(Msg::WebSocket)),
}
}
fn on_url_changed(old: &Url, mut new: Url) -> Msg {
let did_change = *old != new;
let mut messages = Vec::new();
if did_change {
messages.push(Msg::ScrollToTop)
}
info!(
"url changed\nold '{old}'\nnew '{new}', history {}",
history().length().unwrap_or(0)
);
let hpp = new.remaining_hash_path_parts();
let msg = match hpp.as_slice() {
["t", tid] => Msg::ShowThreadRequest {
thread_id: tid.to_string(),
},
["s", query] => {
let query = Url::decode_uri_component(query).unwrap_or("".to_string());
Msg::FrontPageRequest {
query,
after: None,
before: None,
first: None,
last: None,
}
}
["s", query, page] => {
let query = Url::decode_uri_component(query).unwrap_or("".to_string());
let page = page[1..].parse().unwrap_or(0);
Msg::FrontPageRequest {
query,
after: Some(page.to_string()),
before: None,
first: None,
last: None,
}
}
p => {
if !p.is_empty() {
info!("Unhandled path '{p:?}'");
}
Msg::FrontPageRequest {
query: "".to_string(),
after: None,
before: None,
first: None,
last: None,
}
}
};
messages.push(msg);
Msg::MultiMsg(messages)
}
// `update` describes how to handle each `Msg`.
pub fn update(msg: Msg, model: &mut Model, orders: &mut impl Orders<Msg>) {
info!("update({})", msg);
match msg {
Msg::Noop => {}
Msg::RefreshStart => {
model.refreshing_state = RefreshingState::Loading;
orders.perform_cmd(async move {
Msg::RefreshDone(
send_graphql::<_, graphql::refresh_mutation::ResponseData>(
graphql::RefreshMutation::build_query(
graphql::refresh_mutation::Variables {},
),
)
.await
.err(),
)
});
}
Msg::RefreshDone(err) => {
model.refreshing_state = if let Some(err) = err {
RefreshingState::Error(format!("{:?}", err))
} else {
RefreshingState::None
};
orders.perform_cmd(async move { Msg::Refresh });
}
Msg::Refresh => {
orders.request_url(Url::current());
}
Msg::Reload => {
window()
.location()
.reload()
.expect("failed to reload window");
}
Msg::OnUrlChanged(new_url) => {
orders.send_msg(on_url_changed(&model.last_url, new_url.0.clone()));
model.last_url = new_url.0;
}
Msg::NextPage => {
match &model.context {
Context::SearchResult { query, pager, .. } => {
let query = query.to_string();
let after = pager.end_cursor.clone();
orders.perform_cmd(async move {
Msg::FrontPageRequest {
query,
after,
before: None,
first: Some(SEARCH_RESULTS_PER_PAGE as i64),
last: None,
}
});
}
Context::ThreadResult { .. } => (), // do nothing (yet?)
Context::None => (), // do nothing (yet?)
};
}
Msg::PreviousPage => {
match &model.context {
Context::SearchResult { query, pager, .. } => {
let query = query.to_string();
let before = pager.start_cursor.clone();
orders.perform_cmd(async move {
Msg::FrontPageRequest {
query,
after: None,
before,
first: None,
last: Some(SEARCH_RESULTS_PER_PAGE as i64),
}
});
}
Context::ThreadResult { .. } => (), // do nothing (yet?)
Context::None => (), // do nothing (yet?)
};
}
Msg::GoToSearchResults => {
orders.send_msg(Msg::SearchQuery(model.query.clone()));
}
Msg::UpdateQuery(query) => model.query = query,
Msg::SearchQuery(query) => {
orders.request_url(urls::search(&query, 0));
}
Msg::SetUnread(query, unread) => {
orders.skip().perform_cmd(async move {
let res: Result<
graphql_client::Response<graphql::mark_read_mutation::ResponseData>,
gloo_net::Error,
> = send_graphql(graphql::MarkReadMutation::build_query(
graphql::mark_read_mutation::Variables {
query: query.clone(),
unread,
},
))
.await;
if let Err(e) = res {
error!("Failed to set read for {query} to {unread}: {e}");
}
Msg::Refresh
});
}
Msg::AddTag(query, tag) => {
orders.skip().perform_cmd(async move {
let res: Result<
graphql_client::Response<graphql::add_tag_mutation::ResponseData>,
gloo_net::Error,
> = send_graphql(graphql::AddTagMutation::build_query(
graphql::add_tag_mutation::Variables {
query: query.clone(),
tag: tag.clone(),
},
))
.await;
if let Err(e) = res {
error!("Failed to add tag {tag} to {query}: {e}");
}
Msg::GoToSearchResults
});
}
Msg::RemoveTag(query, tag) => {
orders.skip().perform_cmd(async move {
let res: Result<
graphql_client::Response<graphql::remove_tag_mutation::ResponseData>,
gloo_net::Error,
> = send_graphql(graphql::RemoveTagMutation::build_query(
graphql::remove_tag_mutation::Variables {
query: query.clone(),
tag: tag.clone(),
},
))
.await;
if let Err(e) = res {
error!("Failed to remove tag {tag} to {query}: {e}");
}
// TODO: reconsider this behavior
Msg::GoToSearchResults
});
}
Msg::FrontPageRequest {
query,
after,
before,
first,
last,
} => {
let (after, before, first, last) = match (after.as_ref(), before.as_ref(), first, last)
{
// If no pagination set, set reasonable defaults
(None, None, None, None) => {
(None, None, Some(SEARCH_RESULTS_PER_PAGE as i64), None)
}
_ => (after, before, first, last),
};
model.query = query.clone();
orders.skip().perform_cmd(async move {
Msg::FrontPageResult(
send_graphql(graphql::FrontPageQuery::build_query(
graphql::front_page_query::Variables {
query,
after,
before,
first,
last,
},
))
.await,
)
});
}
Msg::FrontPageResult(Err(e)) => {
error!("error FrontPageResult: {e:?}");
}
Msg::FrontPageResult(Ok(graphql_client::Response {
data: None,
errors: None,
..
})) => {
error!("FrontPageResult no data or errors, should not happen");
}
Msg::FrontPageResult(Ok(graphql_client::Response {
data: None,
errors: Some(e),
..
})) => {
error!("FrontPageResult error: {e:?}");
}
Msg::FrontPageResult(Ok(graphql_client::Response {
data: Some(data), ..
})) => {
model.tags = Some(
data.tags
.into_iter()
.map(|t| Tag {
name: t.name,
bg_color: t.bg_color,
unread: t.unread,
})
.collect(),
);
let selected_threads = 'context: {
if let Context::SearchResult {
results,
selected_threads,
..
} = &model.context
{
let old: HashSet<_> = results.iter().map(|n| &n.thread).collect();
let new: HashSet<_> = data.search.nodes.iter().map(|n| &n.thread).collect();
if old == new {
break 'context selected_threads.clone();
}
}
HashSet::new()
};
model.context = Context::SearchResult {
query: model.query.clone(),
results: data.search.nodes,
count: data.count as usize,
pager: data.search.page_info,
selected_threads,
};
orders.send_msg(Msg::UpdateServerVersion(data.version));
// Generate signal so progress bar is reset
orders.send_msg(Msg::WindowScrolled);
}
Msg::ShowThreadRequest { thread_id } => {
orders.skip().perform_cmd(async move {
Msg::ShowThreadResult(
send_graphql(graphql::ShowThreadQuery::build_query(
graphql::show_thread_query::Variables { thread_id },
))
.await,
)
});
}
Msg::ShowThreadResult(Ok(graphql_client::Response {
data: Some(data), ..
})) => {
model.tags = Some(
data.tags
.into_iter()
.map(|t| Tag {
name: t.name,
bg_color: t.bg_color,
unread: t.unread,
})
.collect(),
);
match &data.thread {
graphql::show_thread_query::ShowThreadQueryThread::EmailThread(
ShowThreadQueryThreadOnEmailThread { messages, .. },
) => {
let mut open_messages: HashSet<_> = messages
.iter()
.filter(|msg| msg.tags.iter().any(|t| t == "unread"))
.map(|msg| msg.id.clone())
.collect();
if open_messages.is_empty() {
open_messages = messages.iter().map(|msg| msg.id.clone()).collect();
}
model.context = Context::ThreadResult {
thread: data.thread,
open_messages,
};
}
graphql::show_thread_query::ShowThreadQueryThread::NewsPost(..) => {
model.context = Context::ThreadResult {
thread: data.thread,
open_messages: HashSet::new(),
};
}
}
orders.send_msg(Msg::UpdateServerVersion(data.version));
// Generate signal so progress bar is reset
orders.send_msg(Msg::WindowScrolled);
}
Msg::ShowThreadResult(bad) => {
error!("show_thread_query error: {bad:#?}");
}
Msg::CatchupRequest { query } => {
orders.perform_cmd(async move {
Msg::CatchupResult(
send_graphql::<_, graphql::catchup_query::ResponseData>(
graphql::CatchupQuery::build_query(graphql::catchup_query::Variables {
query,
}),
)
.await,
)
});
}
Msg::CatchupResult(Ok(graphql_client::Response {
data: Some(data), ..
})) => {
let items = data.catchup;
if items.is_empty() {
orders.send_msg(Msg::GoToSearchResults);
model.catchup = None;
} else {
orders.request_url(urls::thread(&items[0]));
model.catchup = Some(Catchup {
items: items
.into_iter()
.map(|id| CatchupItem { id, seen: false })
.collect(),
});
}
}
Msg::CatchupResult(bad) => {
error!("catchup_query error: {bad:#?}");
}
Msg::SelectionSetNone => {
if let Context::SearchResult {
selected_threads, ..
} = &mut model.context
{
*selected_threads = HashSet::new();
}
}
Msg::SelectionSetAll => {
if let Context::SearchResult {
results,
selected_threads,
..
} = &mut model.context
{
*selected_threads = results.iter().map(|node| node.thread.clone()).collect();
}
}
Msg::SelectionAddTag(tag) => {
if let Context::SearchResult {
selected_threads, ..
} = &mut model.context
{
let threads = selected_threads
.iter()
.map(|tid| tid.to_string())
.collect::<Vec<_>>()
.join(" ");
orders
.skip()
.perform_cmd(async move { Msg::AddTag(threads, tag) });
}
}
Msg::SelectionRemoveTag(tag) => {
if let Context::SearchResult {
selected_threads, ..
} = &mut model.context
{
let threads = selected_threads
.iter()
.map(|tid| tid.to_string())
.collect::<Vec<_>>()
.join(" ");
orders
.skip()
.perform_cmd(async move { Msg::RemoveTag(threads, tag) });
}
}
Msg::SelectionMarkAsRead => {
if let Context::SearchResult {
selected_threads, ..
} = &mut model.context
{
let threads = selected_threads
.iter()
.map(|tid| tid.to_string())
.collect::<Vec<_>>()
.join(" ");
orders
.skip()
.perform_cmd(async move { Msg::SetUnread(threads, false) });
}
}
Msg::SelectionMarkAsUnread => {
if let Context::SearchResult {
selected_threads, ..
} = &mut model.context
{
let threads = selected_threads
.iter()
.map(|tid| tid.to_string())
.collect::<Vec<_>>()
.join(" ");
orders
.skip()
.perform_cmd(async move { Msg::SetUnread(threads, true) });
}
}
Msg::SelectionAddThread(tid) => {
if let Context::SearchResult {
selected_threads, ..
} = &mut model.context
{
selected_threads.insert(tid);
}
}
Msg::SelectionRemoveThread(tid) => {
if let Context::SearchResult {
selected_threads, ..
} = &mut model.context
{
selected_threads.remove(&tid);
}
}
Msg::MessageCollapse(id) => {
if let Context::ThreadResult { open_messages, .. } = &mut model.context {
open_messages.remove(&id);
}
}
Msg::MessageExpand(id) => {
if let Context::ThreadResult { open_messages, .. } = &mut model.context {
open_messages.insert(id);
}
}
Msg::MultiMsg(msgs) => msgs.into_iter().for_each(|msg| update(msg, model, orders)),
Msg::CopyToClipboard(text) => {
let clipboard = seed::window().navigator().clipboard();
orders.perform_cmd(async move {
wasm_bindgen_futures::JsFuture::from(clipboard.write_text(&text))
.await
.expect("failed to copy to clipboard");
});
}
Msg::ScrollToTop => {
info!("scrolling to the top");
web_sys::window().unwrap().scroll_to_with_x_and_y(0., 0.);
}
Msg::WindowScrolled => {
// TODO: model.content_el doesn't go to None like it should when a DOM is recreated and the referenced element goes away
if let Some(el) = model.content_el.get() {
let ih = window()
.inner_height()
.expect("window height")
.unchecked_into::<js_sys::Number>()
.value_of();
let r = el.get_bounding_client_rect();
if r.height() < ih {
// The whole content fits in the window, no scrollbar
orders.send_msg(Msg::SetProgress(0.));
return;
}
let end: f64 = r.height() - ih;
if end < 0. {
orders.send_msg(Msg::SetProgress(0.));
return;
}
// Flip Y: r.y() is 0 when the top of the content aligns with the top of the
// viewport and goes negative as the user scrolls down.
let y = -r.y();
let ratio: f64 = (y / end).max(0.);
debug!(
"WindowScrolled ih {ih} end {end} ratio {ratio:.02} {}x{} @ {},{}",
r.width(),
r.height(),
r.x(),
r.y()
);
orders.send_msg(Msg::SetProgress(ratio));
} else {
orders.send_msg(Msg::SetProgress(0.));
}
}
Msg::SetProgress(ratio) => {
model.read_completion_ratio = ratio;
}
Msg::UpdateServerVersion(version) => {
// Only git dev builds contain a dash (see build_version()); don't auto-reload for those
if !version.contains('-') && version != model.versions.client {
warn!(
"Server ({}) and client ({}) version mismatch, reloading",
version, model.versions.client
);
orders.send_msg(Msg::Reload);
}
model.versions.server = Some(version);
}
Msg::CatchupStart => {
let query = if model.query.contains("is:unread") {
model.query.to_string()
} else {
format!("{} is:unread", model.query)
};
info!("starting catchup mode w/ {}", query);
orders.send_msg(Msg::ScrollToTop);
orders.send_msg(Msg::CatchupRequest { query });
}
Msg::CatchupKeepUnread => {
orders.send_msg(Msg::CatchupNext);
}
Msg::CatchupMarkAsRead => {
if let Some(thread_id) = current_thread_id(&model.context) {
orders.send_msg(Msg::SetUnread(thread_id, false));
};
orders.send_msg(Msg::CatchupNext);
}
Msg::CatchupNext => {
orders.send_msg(Msg::ScrollToTop);
let Some(catchup) = &mut model.catchup else {
orders.send_msg(Msg::GoToSearchResults);
return;
};
let Some(thread_id) = current_thread_id(&model.context) else {
return;
};
let Some(idx) = catchup
.items
.iter()
.inspect(|i| info!("i {i:?} thread_id {thread_id}"))
.position(|i| i.id == thread_id)
else {
// All items have been seen
orders.send_msg(Msg::CatchupExit);
orders.send_msg(Msg::GoToSearchResults);
return;
};
catchup.items[idx].seen = true;
if idx < catchup.items.len() - 1 {
// More items remain; advance to the next one
orders.request_url(urls::thread(&catchup.items[idx + 1].id));
return;
} else {
// Reached the last item
orders.send_msg(Msg::CatchupExit);
orders.send_msg(Msg::GoToSearchResults);
return;
};
}
Msg::CatchupExit => {
orders.send_msg(Msg::ScrollToTop);
model.catchup = None;
}
Msg::WebSocket(ws) => {
websocket::update(ws, &mut model.websocket, &mut orders.proxy(Msg::WebSocket));
while let Some(msg) = model.websocket.updates.pop_front() {
orders.send_msg(Msg::WebsocketMessage(msg));
}
}
Msg::WebsocketMessage(msg) => {
match msg {
WebsocketMessage::RefreshMessages => orders.send_msg(Msg::Refresh),
};
}
}
}
fn current_thread_id(context: &Context) -> Option<String> {
match context {
Context::ThreadResult {
thread:
ShowThreadQueryThread::EmailThread(ShowThreadQueryThreadOnEmailThread {
thread_id, ..
}),
..
} => Some(thread_id.clone()),
Context::ThreadResult {
thread:
ShowThreadQueryThread::NewsPost(ShowThreadQueryThreadOnNewsPost { thread_id, .. }),
..
} => Some(thread_id.clone()),
_ => None,
}
}
// `Model` describes our app state.
pub struct Model {
pub query: String,
pub context: Context,
pub refreshing_state: RefreshingState,
pub tags: Option<Vec<Tag>>,
pub read_completion_ratio: f64,
pub content_el: ElRef<HtmlElement>,
pub versions: Version,
pub catchup: Option<Catchup>,
pub last_url: Url,
pub websocket: websocket::Model,
}
#[derive(Debug)]
pub struct Version {
pub client: String,
pub server: Option<String>,
}
#[derive(Error, Debug)]
#[allow(dead_code)] // Remove once the UI is showing errors
pub enum UIError {
#[error("No error, this should never be presented to user")]
NoError,
#[error("failed to fetch {0}: {1:?}")]
FetchError(&'static str, gloo_net::Error),
#[error("{0} error decoding: {1:?}")]
FetchDecodeError(&'static str, Vec<graphql_client::Error>),
#[error("no data or errors for {0}")]
NoData(&'static str),
}
pub enum Context {
None,
SearchResult {
query: String,
results: Vec<FrontPageQuerySearchNodes>,
count: usize,
pager: FrontPageQuerySearchPageInfo,
selected_threads: HashSet<String>,
},
ThreadResult {
thread: ShowThreadQueryThread,
open_messages: HashSet<String>,
},
}
pub struct Catchup {
pub items: Vec<CatchupItem>,
}
#[derive(Debug)]
pub struct CatchupItem {
pub id: String,
pub seen: bool,
}
pub struct Tag {
pub name: String,
pub bg_color: String,
pub unread: i64,
}
#[derive(Debug, PartialEq)]
pub enum RefreshingState {
None,
Loading,
Error(String),
}
// `Msg` describes the different events you can modify state with.
#[derive(strum_macros::Display)]
pub enum Msg {
Noop,
// Tell the client to refresh its state
Refresh,
// Tell the client to reload whole page from server
Reload,
// TODO: add GoToUrl
OnUrlChanged(subs::UrlChanged),
// Tell the server to update state
RefreshStart,
RefreshDone(Option<gloo_net::Error>),
NextPage,
PreviousPage,
GoToSearchResults,
UpdateQuery(String),
SearchQuery(String),
SetUnread(String, bool),
AddTag(String, String),
RemoveTag(String, String),
FrontPageRequest {
query: String,
after: Option<String>,
before: Option<String>,
first: Option<i64>,
last: Option<i64>,
},
FrontPageResult(
Result<graphql_client::Response<graphql::front_page_query::ResponseData>, gloo_net::Error>,
),
ShowThreadRequest {
thread_id: String,
},
ShowThreadResult(
Result<graphql_client::Response<graphql::show_thread_query::ResponseData>, gloo_net::Error>,
),
CatchupRequest {
query: String,
},
CatchupResult(
Result<graphql_client::Response<graphql::catchup_query::ResponseData>, gloo_net::Error>,
),
SelectionSetNone,
SelectionSetAll,
SelectionAddTag(String),
#[allow(dead_code)]
SelectionRemoveTag(String),
SelectionMarkAsRead,
SelectionMarkAsUnread,
SelectionAddThread(String),
SelectionRemoveThread(String),
MessageCollapse(String),
MessageExpand(String),
MultiMsg(Vec<Msg>),
CopyToClipboard(String),
ScrollToTop,
WindowScrolled,
SetProgress(f64),
UpdateServerVersion(String),
CatchupStart,
CatchupKeepUnread,
CatchupMarkAsRead,
CatchupNext,
CatchupExit,
WebSocket(websocket::Msg),
WebsocketMessage(WebsocketMessage),
}

web/src/tailwind.css Normal file

@@ -0,0 +1,3 @@
@tailwind base;
@tailwind components;
@tailwind utilities;

web/src/view/mod.rs Normal file

File diff suppressed because it is too large

web/src/websocket.rs Normal file

@@ -0,0 +1,220 @@
use std::{collections::VecDeque, rc::Rc};
use letterbox_shared::WebsocketMessage;
use log::{error, info};
use seed::prelude::*;
use serde::{Deserialize, Serialize};
#[cfg(not(target_arch = "wasm32"))]
#[allow(dead_code)]
mod wasm_sockets {
use std::{cell::RefCell, rc::Rc};
use thiserror::Error;
use web_sys::{CloseEvent, ErrorEvent};
#[derive(Debug)]
pub struct JsValue;
#[derive(Debug)]
pub enum ConnectionStatus {
/// Connecting to a server
Connecting,
/// Connected to a server
Connected,
/// Disconnected from a server due to an error
Error,
/// Disconnected from a server without an error
Disconnected,
}
#[derive(Debug)]
pub struct EventClient {
pub status: Rc<RefCell<ConnectionStatus>>,
}
impl EventClient {
pub fn new(_: &str) -> Result<Self, WebSocketError> {
todo!("this is a mock")
}
pub fn send_string(&self, _message: &str) -> Result<(), JsValue> {
todo!("this is a mock")
}
pub fn set_on_error(&mut self, _: Option<Box<dyn Fn(ErrorEvent)>>) {
todo!("this is a mock")
}
pub fn set_on_connection(&mut self, _: Option<Box<dyn Fn(&EventClient)>>) {
todo!("this is a mock")
}
pub fn set_on_close(&mut self, _: Option<Box<dyn Fn(CloseEvent)>>) {
todo!("this is a mock")
}
pub fn set_on_message(&mut self, _: Option<Box<dyn Fn(&EventClient, Message)>>) {
todo!("this is a mock")
}
}
#[derive(Debug, Clone)]
pub enum Message {
Text(String),
Binary(Vec<u8>),
}
#[derive(Debug, Clone, Error)]
pub enum WebSocketError {}
}
#[cfg(not(target_arch = "wasm32"))]
use wasm_sockets::{ConnectionStatus, EventClient, Message, WebSocketError};
#[cfg(target_arch = "wasm32")]
use wasm_sockets::{ConnectionStatus, EventClient, Message, WebSocketError};
use web_sys::CloseEvent;
/// Message from the server to the client.
#[derive(Serialize, Deserialize)]
pub struct ServerMessage {
pub id: usize,
pub text: String,
}
/// Message from the client to the server.
#[derive(Serialize, Deserialize)]
pub struct ClientMessage {
pub text: String,
}
//const WS_URL: &str = "wss://9000.z.xinu.tv/api/ws";
//const WS_URL: &str = "wss://9345.z.xinu.tv/api/graphql/ws";
//const WS_URL: &str = "wss://6758.z.xinu.tv/api/ws";
// ------ ------
// Model
// ------ ------
pub struct Model {
ws_url: String,
web_socket: EventClient,
web_socket_reconnector: Option<StreamHandle>,
pub updates: VecDeque<WebsocketMessage>,
}
// ------ ------
// Init
// ------ ------
pub fn init(ws_url: &str, orders: &mut impl Orders<Msg>) -> Model {
Model {
ws_url: ws_url.to_string(),
web_socket: create_websocket(ws_url, orders).unwrap(),
web_socket_reconnector: None,
updates: VecDeque::new(),
}
}
// ------ ------
// Update
// ------ ------
pub enum Msg {
WebSocketOpened,
TextMessageReceived(WebsocketMessage),
WebSocketClosed(CloseEvent),
WebSocketFailed,
ReconnectWebSocket(usize),
#[allow(dead_code)]
SendMessage(ClientMessage),
}
pub fn update(msg: Msg, model: &mut Model, orders: &mut impl Orders<Msg>) {
match msg {
Msg::WebSocketOpened => {
model.web_socket_reconnector = None;
info!("WebSocket connection is open now");
}
Msg::TextMessageReceived(msg) => {
model.updates.push_back(msg);
}
Msg::WebSocketClosed(close_event) => {
info!(
r#"==================
WebSocket connection was closed:
Clean: {0}
Code: {1}
Reason: {2}
=================="#,
close_event.was_clean(),
close_event.code(),
close_event.reason()
);
// Chrome doesn't invoke `on_error` when the connection is lost.
if !close_event.was_clean() && model.web_socket_reconnector.is_none() {
model.web_socket_reconnector = Some(
orders.stream_with_handle(streams::backoff(None, Msg::ReconnectWebSocket)),
);
}
}
Msg::WebSocketFailed => {
info!("WebSocket failed");
if model.web_socket_reconnector.is_none() {
model.web_socket_reconnector = Some(
orders.stream_with_handle(streams::backoff(None, Msg::ReconnectWebSocket)),
);
}
}
Msg::ReconnectWebSocket(retries) => {
info!("Reconnect attempt: {}", retries);
model.web_socket = create_websocket(&model.ws_url, orders).unwrap();
}
Msg::SendMessage(msg) => {
let txt = serde_json::to_string(&msg).unwrap();
model.web_socket.send_string(&txt).unwrap();
}
}
}
fn create_websocket(url: &str, orders: &impl Orders<Msg>) -> Result<EventClient, WebSocketError> {
let msg_sender = orders.msg_sender();
let mut client = EventClient::new(url)?;
client.set_on_error(Some(Box::new(|error| {
gloo_console::error!("WS: ", error);
})));
let send = msg_sender.clone();
client.set_on_connection(Some(Box::new(move |client: &EventClient| {
info!("{:#?}", client.status);
let msg = match *client.status.borrow() {
ConnectionStatus::Connecting => {
info!("Connecting...");
None
}
ConnectionStatus::Connected => Some(Msg::WebSocketOpened),
ConnectionStatus::Error => Some(Msg::WebSocketFailed),
ConnectionStatus::Disconnected => {
info!("Disconnected");
None
}
};
send(msg);
})));
let send = msg_sender.clone();
client.set_on_close(Some(Box::new(move |ev| {
info!("WS: Connection closed");
send(Some(Msg::WebSocketClosed(ev)));
})));
let send = msg_sender.clone();
client.set_on_message(Some(Box::new(move |_: &EventClient, msg: Message| {
decode_message(msg, Rc::clone(&send))
})));
Ok(client)
}
fn decode_message(message: Message, msg_sender: Rc<dyn Fn(Option<Msg>)>) {
match message {
Message::Text(txt) => {
let msg: WebsocketMessage = serde_json::from_str(&txt).unwrap_or_else(|e| {
panic!("failed to parse json into WebsocketMessage: {e}\n'{txt}'")
});
msg_sender(Some(Msg::TextMessageReceived(msg)));
}
m => error!("unexpected message type received of {m:?}"),
}
}
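The module is meant to be embedded behind Orders::proxy; a condensed wiring sketch mirroring web/src/state.rs:
// Parent init: own the model, forward this module's Msg through a proxy.
let ws = websocket::init("/api/ws", &mut orders.proxy(Msg::WebSocket));
// Parent update: run this module's update, then drain queued server messages.
websocket::update(ws_msg, &mut model.websocket, &mut orders.proxy(Msg::WebSocket));
while let Some(m) = model.websocket.updates.pop_front() {
    orders.send_msg(Msg::WebsocketMessage(m));
}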

web/static/main.min.css vendored Normal file

File diff suppressed because one or more lines are too long

Some files were not shown because too many files have changed in this diff