Compare commits

..

263 Commits

Author SHA1 Message Date
b622bb7d7d chore: Release 2025-03-08 07:57:33 -08:00
43efdf18a0 web: reload page on fetch error. Should help with expired cookies 2025-03-08 07:57:12 -08:00
c71ab8e9e8 chore: Release 2025-03-08 07:52:40 -08:00
408d6ed8ba web: only reload on version skew in release 2025-03-08 07:52:03 -08:00
1411961e36 web: center contents in catchup mode 2025-03-08 07:52:03 -08:00
dfd7ef466c Only rebuild on push 2025-03-08 07:52:03 -08:00
2aa3dfbd0f fix(deps): update rust crate serde_json to v1.0.140 2025-03-03 09:46:00 +00:00
fba10e27cf fix(deps): update all non-major dependencies 2025-03-03 06:03:25 +00:00
5417c74f9c fix(deps): update rust crate thiserror to v2.0.12 2025-03-03 04:46:31 +00:00
eb0b0dbe81 chore(deps): lock file maintenance 2025-03-03 00:01:36 +00:00
561f522658 fix(deps): update rust crate mailparse to v0.16.1 2025-02-27 23:33:39 +00:00
32d2ffeb3d chore: Release 2025-02-27 15:16:09 -08:00
d41946e0a5 web: change style for mark read catchup button 2025-02-27 15:15:49 -08:00
61402858f4 web: add TODO 2025-02-27 15:15:42 -08:00
17de318645 chore: Release 2025-02-26 15:43:34 -08:00
3aa0144e8d web: try setting history.scroll_restoration to manual to improve inter-page flow 2025-02-26 15:43:18 -08:00
f9eafff4c7 web: add "go home" button to catchup view 2025-02-26 15:43:18 -08:00
4c6d67901d fix(deps): update rust crate uuid to v1.15.1 2025-02-26 21:15:57 +00:00
e9aa97a089 fix(deps): update rust crate chrono to v0.4.40 2025-02-26 08:46:20 +00:00
a82b047f75 fix(deps): update rust crate uuid to v1.15.0 2025-02-26 06:16:01 +00:00
9a8b44a8df fix(deps): update all non-major dependencies to 0.0.40 2025-02-26 04:47:10 +00:00
a96693004c chore: Release 2025-02-25 20:43:47 -08:00
ed9fe11fbf web: trimmed views for catchup mode 2025-02-25 20:43:27 -08:00
09fb14a796 chore: Release 2025-02-25 20:08:44 -08:00
58a7936bba web: address lint 2025-02-25 20:08:31 -08:00
cd0ee361f5 chore: Release 2025-02-25 20:06:18 -08:00
77bd5abe0d Don't do incremental builds when releasing 2025-02-25 20:06:11 -08:00
450c5496b3 chore: Release 2025-02-25 20:04:01 -08:00
4411e45a3c Don't allow warnings when publishing 2025-02-25 20:03:40 -08:00
e7d20896d5 web: remove unnecessary Msg variant 2025-02-25 16:20:32 -08:00
32a1115abd chore: Release 2025-02-25 15:58:46 -08:00
4982057500 web: more scroll to top improvements by reworking URL changes 2025-02-25 15:58:24 -08:00
8977f8bab5 chore: Release 2025-02-25 13:51:38 -08:00
0962a6b3cf web: improve scroll-to-top behavior 2025-02-25 13:51:11 -08:00
3c72929a4f web: enable properly styled buttons 2025-02-25 10:26:16 -08:00
e4eb495a70 web: properly exit catchup mode when done 2025-02-25 10:25:28 -08:00
00e8b0342e chore: Release 2025-02-24 18:41:19 -08:00
b1f9867c06 web: remove debug statement 2025-02-24 18:41:00 -08:00
77943b3570 web: scroll to top on page changes 2025-02-24 18:39:47 -08:00
45e4edb1dd web: add icons to catchup controls 2025-02-24 17:09:16 -08:00
9bf53afebf server: sort catchup ids by timestamp across all sources 2025-02-24 17:08:57 -08:00
e1a502ac4b chore: Release 2025-02-24 14:56:17 -08:00
9346c46e62 web: change exit catchup behavior to view current message 2025-02-24 14:55:51 -08:00
1452746305 chore: Release 2025-02-24 14:38:44 -08:00
2e526dace1 Implement catchup mode (show original/delivered To if no xinu.tv addresses in To/CC fields) 2025-02-24 14:38:18 -08:00
76be5b7cac fix(deps): update rust crate clap to v4.5.31 2025-02-24 16:00:55 +00:00
3f0b2caedf fix(deps): update rust crate scraper to 0.23.0 2025-02-24 09:31:24 +00:00
ec6dc35ca8 chore(deps): lock file maintenance 2025-02-24 00:01:18 +00:00
01e1ca927e chore: Release 2025-02-23 11:47:04 -08:00
1cc52d6c96 web: show X-Original-To: if To: is missing, fall back to Delivered-To: 2025-02-23 11:46:21 -08:00
e6b3a5b5a9 notmuch & server: plumb Delivered-To and X-Original-To headers 2025-02-23 09:37:09 -08:00
bc4b15a5aa chore: Release 2025-02-22 17:58:37 -08:00
00f61cf6be server: recursively descend email threads to find all unread recipients 2025-02-22 17:58:07 -08:00
52e24437bd chore: Release 2025-02-22 17:27:54 -08:00
393ffc8506 notmuch: normalize unread_recipients to lower case 2025-02-22 17:27:30 -08:00
2b6cb6ec6e chore: Release 2025-02-22 17:24:31 -08:00
0cba3a624c web: add de/select all checkbox with tristate 2025-02-22 17:24:18 -08:00
73433711ca fix(deps): update rust crate xtracing to 0.3.0 2025-02-23 00:02:30 +00:00
965afa6871 Merge pull request 'fix(deps): update rust crate seed_hooks to 0.4.0' (#48) from renovate/seed_hooks-0.x into master 2025-02-22 15:49:50 -08:00
e70dbaf917 fix(deps): update rust crate seed_hooks to 0.4.0 2025-02-22 15:18:33 -08:00
6b4ce11743 fix(deps): update rust crate xtracing to v0.2.1 2025-02-22 22:31:55 +00:00
d1980a55a7 fix(deps): update rust crate cacher to v0.1.5 2025-02-22 21:16:46 +00:00
8b78b39d4c chore: Release 2025-02-22 13:10:03 -08:00
ae17651eb5 Normalize Justfile config 2025-02-22 13:08:15 -08:00
22fd8409f6 chore: Release 2025-02-22 12:41:57 -08:00
d0a4ba417f chore: Release 2025-02-22 12:41:30 -08:00
7b09b098a4 chore: Release 2025-02-22 12:41:15 -08:00
bd4c10a8fb Specify registry for all letterbox-* deps 2025-02-22 12:41:15 -08:00
ed3c5f152e chore: Release 2025-02-22 12:41:15 -08:00
63232d1e92 Publish only to xinu 2025-02-22 12:41:15 -08:00
4a3eba80d5 chore: Release 2025-02-22 12:41:15 -08:00
71d3745342 Try relative paths for letterbox-* deps 2025-02-22 12:41:14 -08:00
5fdc98633d chore: Release 2025-02-22 12:39:39 -08:00
57877f268d Set repository in workspace 2025-02-22 12:39:20 -08:00
871a93d58f Move most package metadata to workspace 2025-02-22 12:39:20 -08:00
4b7cbd4f9b chore: Release 2025-02-22 12:39:19 -08:00
aa2a9815df Add automatic per-email address unread folders 2025-02-22 12:38:57 -08:00
2e5b18a008 Fix cargo-udeps build step 2025-02-22 12:37:27 -08:00
d0a38114cc Add cargo-udeps build step 2025-02-22 12:37:27 -08:00
ccc1d516c7 fix(deps): update rust crate letterbox-notmuch to 0.8.0 2025-02-22 19:15:52 +00:00
246b710fdd fix(deps): update rust crate log to v0.4.26 2025-02-21 05:46:06 +00:00
1a21c9fa8e fix(deps): update rust crate uuid to v1.14.0 2025-02-21 00:30:51 +00:00
9fd912b1d4 fix(deps): update rust crate serde to v1.0.218 2025-02-20 05:31:10 +00:00
9ded32f97b fix(deps): update rust crate anyhow to v1.0.96 2025-02-20 03:16:55 +00:00
10aac046bc fix(deps): update rust crate serde_json to v1.0.139 2025-02-20 03:00:53 +00:00
f4527baf89 fix(deps): update rust crate seed_hooks to 0.4.0 2025-02-18 20:15:48 +00:00
11ec5bf747 fix(deps): update rust crate uuid to v1.13.2 2025-02-17 23:46:05 +00:00
6a53679755 fix(deps): update rust crate clap to v4.5.30 2025-02-17 19:15:50 +00:00
7bedec0692 chore(deps): lock file maintenance
2025-02-17 00:01:14 +00:00
78feb95811 chore: Release
2025-02-15 14:49:11 -08:00
3aad2bb80e web: another attempt to fix progress bar 2025-02-15 14:47:32 -08:00
0df8de3661 web: use seed_hooks ability to create ev handlers 2025-02-15 14:47:32 -08:00
83ecc73fbd fix(deps): update rust crate seed_hooks to v0.1.16
2025-02-14 01:15:49 +00:00
c10313cd12 fix(deps): update rust crate letterbox-shared to 0.6.0
2025-02-13 23:31:34 +00:00
4c98bcd9cb Merge pull request 'fix(deps): update rust crate letterbox-notmuch to 0.6.0' (#34) from renovate/letterbox-notmuch-0.x into master
Reviewed-on: #34
2025-02-13 15:17:39 -08:00
004de235a8 fix(deps): update rust crate letterbox-notmuch to 0.6.0
2025-02-13 23:16:31 +00:00
90dbeb6f20 chore: Release
2025-02-13 15:09:58 -08:00
9aa298febe web: use crate version of seed_hooks 2025-02-13 15:09:34 -08:00
5a13a497dc chore: Release 2025-02-13 14:30:47 -08:00
37711e14dd chore: Release 2025-02-13 14:01:24 -08:00
e89fd28707 web: pin seed_hooks version 2025-02-13 14:01:06 -08:00
7a91ee2f49 chore: Release 2025-02-13 13:29:52 -08:00
4b76ea5392 Justfile: run release w/ --no-confirm 2025-02-13 13:29:29 -08:00
d2a81b7bd9 Revert "Justfile: try without --workspace flag"
This reverts commit 9dd39509b5.
2025-02-13 13:29:17 -08:00
9dd39509b5 Justfile: try without --workspace flag 2025-02-13 13:28:35 -08:00
d605bcfe7a web: move to version 0.3 to sync with other crates 2025-02-13 13:25:01 -08:00
73abdb535a Justfile: actually call _release on build 2025-02-13 11:56:09 -08:00
ab9506c4f6 Starter justfile that will hopefully replace make 2025-02-13 11:51:59 -08:00
994a629401 web: update letterbox-notmuch dependency
2025-02-13 11:37:32 -08:00
00c55160a7 Add web back to workspace 2025-02-13 11:31:43 -08:00
e3c6edb894 Merge pull request 'fix(deps): update rust crate letterbox-shared to 0.3.0' (#35) from renovate/letterbox-shared-0.x into master
Reviewed-on: #35
2025-02-13 11:31:21 -08:00
4574c016cd fix(deps): update rust crate letterbox-shared to 0.3.0
2025-02-13 18:45:52 +00:00
ca6c19f4c8 chore: Release
2025-02-13 10:32:43 -08:00
0f51f6e71f server: copy vars.css from web so I can publish release 2025-02-13 10:32:20 -08:00
4bd672bf94 chore: Release 2025-02-13 10:18:40 -08:00
136fd77f3b Add server back to workspace 2025-02-13 10:18:30 -08:00
ee9b6be95e Temporarily remove web and server from workspace to publish other crates
2025-02-13 10:16:55 -08:00
38c553d385 Use packaged version of crates 2025-02-13 10:16:36 -08:00
1b073665a7 chore: Release 2025-02-13 09:49:11 -08:00
2076596f50 Rename all crates to start with letterbox- 2025-02-13 09:48:24 -08:00
d1beaded09 Update Cargo.toml for packaging 2025-02-13 09:47:41 -08:00
2562bdfedf server: tool for testing inline code 2025-02-13 09:47:41 -08:00
86c6face7d server: sql to debug search indexing w/ postgres 2025-02-13 09:47:41 -08:00
4a7ff8bf7b notmuch: exclude testdata dir when packaging
Contains filenames cargo package doesn't like
2025-02-13 09:47:41 -08:00
8c280d3616 web: fix styling for slashdot's story byline 2025-02-13 09:47:41 -08:00
eb4d4164ef web: fix progress bar on mobile 2025-02-13 09:47:41 -08:00
c7740811bf fix(deps): update rust crate opentelemetry to 0.28.0
2025-02-12 21:30:57 +00:00
55679cf61b fix(deps): update rust crate xtracing to 0.2.0
2025-02-12 21:15:55 +00:00
1b1c80b1b8 web: annotate some more (temporary) dead code
2025-02-12 13:03:45 -08:00
8743b1f56b web: install trunk in CI
2025-02-12 11:46:31 -08:00
eb6f1b5346 web: run trunk build in CI
2025-02-12 09:03:37 -08:00
6bb6d380a9 Bumping version to 0.0.144
2025-02-12 08:50:09 -08:00
39eea04bf6 Bumping version to 0.0.143 2025-02-12 08:50:04 -08:00
2711147cd6 web: hide nautilus ads 2025-02-12 08:50:04 -08:00
083b7c9f1c Merge pull request 'fix(deps): update rust crate thiserror to v2' (#27) from renovate/thiserror-2.x into master
Reviewed-on: #27
2025-02-11 20:27:41 -08:00
5ade886a72 fix(deps): update rust-wasm-bindgen monorepo
2025-02-12 00:46:04 +00:00
52575e13f6 Bumping version to 0.0.142
2025-02-11 16:42:24 -08:00
3aaee8add3 web: rollback wasm-bindgen 2025-02-11 16:42:10 -08:00
5e188a70f9 fix(deps): update rust crate clap to v4.5.29
2025-02-11 20:00:45 +00:00
f9e5c87d2b fix(deps): update rust-wasm-bindgen monorepo
2025-02-11 16:46:05 +00:00
7d40cf8a4a Bumping version to 0.0.141
2025-02-11 08:36:30 -08:00
1836026736 update cacher dependency 2025-02-11 08:36:24 -08:00
79db0f8cfa Bumping version to 0.0.140
2025-02-10 17:44:22 -08:00
95c29dc73c web: CSS indent lists 2025-02-10 17:44:07 -08:00
2b0ee42cdc Bumping version to 0.0.139
2025-02-10 17:33:46 -08:00
c90ac1d4fc web: pin web-sys to 0.2.95, to work with CLI in nixos 2025-02-10 17:33:17 -08:00
a9803bb6a1 fix(deps): update rust crate thiserror to v2
2025-02-11 01:31:42 +00:00
74219ad333 web: fix uuid dep
2025-02-10 17:28:04 -08:00
2073b7b132 Changes necessary for latest cargo packages 2025-02-10 14:57:40 -08:00
58dae5df6f gitea: initial setup
2025-02-09 18:34:07 -08:00
c89fc9b6d4 Merge pull request 'fix(deps): update rust crate mailparse to 0.16.0' (#28) from renovate/mailparse-0.x into master
Reviewed-on: #28
2025-02-09 15:33:57 -08:00
f7ab08c1e6 fix(deps): update rust crate mailparse to 0.16.0 2025-02-09 23:30:40 +00:00
221fead7dc cargo update 2025-02-09 14:54:13 -08:00
3491cb9593 Merge pull request 'fix(deps): update rust crate tokio to v1.43.0' (#24) from renovate/tokio-1.x-lockfile into master
Reviewed-on: #24
2025-02-09 14:52:02 -08:00
037b3231ac fix(deps): update rust crate tokio to v1.43.0 2025-02-09 22:45:44 +00:00
75f38c1e94 Merge pull request 'fix(deps): update rust crate scraper to 0.22.0' (#23) from renovate/scraper-0.x into master
Reviewed-on: #23
2025-02-09 14:30:43 -08:00
977bcd0bf4 Merge pull request 'fix(deps): update rust crate itertools to 0.14.0' (#22) from renovate/itertools-0.x into master
Reviewed-on: #22
2025-02-09 14:30:33 -08:00
838459e5a8 Merge pull request 'fix(deps): update rust crate graphql_client to 0.14.0' (#21) from renovate/graphql_client-0.x into master
Reviewed-on: #21
2025-02-09 14:30:21 -08:00
d208a31348 Merge pull request 'fix(deps): update rust crate gloo-net to 0.6.0' (#20) from renovate/gloo-net-0.x into master
Reviewed-on: #20
2025-02-09 14:30:12 -08:00
0a640bea6f Merge pull request 'fix(deps): update rust crate css-inline to 0.14.0' (#19) from renovate/css-inline-0.x into master
Reviewed-on: #19
2025-02-09 14:30:02 -08:00
84a2962561 Merge pull request 'chore(deps): update dependency font-awesome to v6.7.2' (#18) from renovate/font-awesome-6.x into master
Reviewed-on: #18
2025-02-09 14:29:49 -08:00
6c71be7a3a Merge pull request 'fix(deps): update rust crate xtracing to v0.1.3' (#16) from renovate/xtracing-0.x-lockfile into master
Reviewed-on: #16
2025-02-09 14:29:36 -08:00
77562505b4 Merge pull request 'fix(deps): update rust crate sqlx to v0.8.3' (#15) from renovate/sqlx-0.x-lockfile into master
Reviewed-on: #15
2025-02-09 14:29:24 -08:00
c83d3dcf1d Merge pull request 'fix(deps): update rust crate serde_json to v1.0.138' (#14) from renovate/serde_json-1.x-lockfile into master
Reviewed-on: #14
2025-02-09 14:29:03 -08:00
081077d2c2 Merge pull request 'fix(deps): update rust crate serde to v1.0.217' (#13) from renovate/serde-monorepo into master
Reviewed-on: #13
2025-02-09 14:28:53 -08:00
4cfc6a73fc Merge pull request 'fix(deps): update rust crate log to v0.4.25' (#11) from renovate/log-0.x-lockfile into master
Reviewed-on: #11
2025-02-09 14:28:43 -08:00
f1c132830f Merge pull request 'fix(deps): update rust crate clap to v4.5.28' (#10) from renovate/clap-4.x-lockfile into master
Reviewed-on: #10
2025-02-09 14:28:30 -08:00
5aff7c6e85 Merge pull request 'fix(deps): update rust crate cacher to v0.1.4' (#9) from renovate/cacher-0.x-lockfile into master
Reviewed-on: #9
2025-02-09 14:28:19 -08:00
2c09713e20 Merge pull request 'fix(deps): update rust crate async-trait to v0.1.86' (#7) from renovate/async-trait-0.x-lockfile into master
Reviewed-on: #7
2025-02-09 14:28:07 -08:00
3d544feeb5 Merge pull request 'fix(deps): update rust crate ammonia to v4' (#25) from renovate/ammonia-4.x into master
Reviewed-on: #25
2025-02-09 13:57:23 -08:00
5830ed0bb1 Merge branch 'master' into renovate/ammonia-4.x 2025-02-09 13:57:13 -08:00
83aed683f5 fix(deps): update rust crate sqlx to v0.8.3 2025-02-09 21:15:54 +00:00
72385b3987 Merge pull request 'fix(deps): update rust crate lol_html to v2' (#26) from renovate/lol_html-2.x into master
Reviewed-on: #26
2025-02-09 13:01:28 -08:00
f21893b52e Bumping version to 0.0.138 2025-02-09 12:52:36 -08:00
0b81529509 build-info: one last version bump 2025-02-09 12:52:23 -08:00
9790bbea83 Bumping version to 0.0.137 2025-02-09 12:49:53 -08:00
7aa620a9da Update all build-info versions to fix build 2025-02-09 12:49:25 -08:00
2e67db0b4e fix(deps): update rust crate css-inline to 0.14.0 2025-02-09 20:30:48 +00:00
cd777b2894 fix(deps): update rust crate lol_html to v2 2025-02-09 20:17:15 +00:00
049e9728a2 fix(deps): update rust crate ammonia to v4 2025-02-09 20:17:10 +00:00
0952cdf9cb fix(deps): update rust crate scraper to 0.22.0 2025-02-09 20:16:59 +00:00
5f4a4e81cb fix(deps): update rust crate itertools to 0.14.0 2025-02-09 20:16:54 +00:00
38c2c508e8 fix(deps): update rust crate graphql_client to 0.14.0 2025-02-09 20:16:48 +00:00
4cd3664e32 fix(deps): update rust crate gloo-net to 0.6.0 2025-02-09 20:16:44 +00:00
71996f6c48 chore(deps): update dependency font-awesome to v6.7.2 2025-02-09 20:16:33 +00:00
6e227de00f fix(deps): update rust crate xtracing to v0.1.3 2025-02-09 20:16:24 +00:00
3576e67af7 Merge pull request 'fix(deps): update rust crate reqwest to v0.12.12' (#12) from renovate/reqwest-0.x-lockfile into master
Reviewed-on: #12
2025-02-09 12:16:14 -08:00
19f0f60653 fix(deps): update rust crate serde_json to v1.0.138 2025-02-09 20:16:12 +00:00
3502eeb711 fix(deps): update rust crate serde to v1.0.217 2025-02-09 20:16:02 +00:00
fd770d03ab fix(deps): update rust crate reqwest to v0.12.12 2025-02-09 20:15:54 +00:00
d99b7ae34c fix(deps): update rust crate log to v0.4.25 2025-02-09 20:15:48 +00:00
f18aa8c8d4 fix(deps): update rust crate clap to v4.5.28 2025-02-09 20:15:35 +00:00
dcdcb5b5a3 fix(deps): update rust crate cacher to v0.1.4 2025-02-09 20:15:31 +00:00
884e4b5831 fix(deps): update rust crate async-trait to v0.1.86 2025-02-09 20:15:19 +00:00
5981356492 Merge pull request 'fix(deps): update rust crate async-graphql-rocket to v7.0.15' (#6) from renovate/async-graphql-rocket-7.x-lockfile into master
Reviewed-on: #6
2025-02-09 12:10:15 -08:00
386b6915c5 fix(deps): update rust crate async-graphql-rocket to v7.0.15 2025-02-09 20:09:39 +00:00
5a6f04536f Merge pull request 'chore(deps): update rust crate build-info-build to 0.0.39' (#2) from renovate/build-info-build-0.x into master
Reviewed-on: #2
2025-02-09 11:21:21 -08:00
ae1d9e6db7 Merge pull request 'fix(deps): update rust crate anyhow to v1.0.95' (#3) from renovate/anyhow-1.x-lockfile into master
Reviewed-on: #3
2025-02-09 11:21:03 -08:00
24d50c21f5 fix(deps): update rust crate anyhow to v1.0.95 2025-02-09 19:08:21 +00:00
b4d72da639 chore(deps): update rust crate build-info-build to 0.0.39 2025-02-09 19:08:17 +00:00
dacb258289 Merge pull request 'chore: Configure Renovate' (#1) from renovate/configure into master
Reviewed-on: #1
2025-02-09 11:06:35 -08:00
5c674d4603 Add renovate.json 2025-02-09 19:01:46 +00:00
2e9753e91d Bumping version to 0.0.136 2025-02-06 08:17:10 -08:00
971e1049c7 web: allow plaintext emails to wrap 2025-02-06 08:16:53 -08:00
11c76332f3 Bumping version to 0.0.135 2025-02-06 07:46:34 -08:00
52d03ae964 web: tweak figure bg color on hackaday 2025-02-06 07:46:13 -08:00
c4043f6c56 Bumping version to 0.0.134 2025-02-05 09:18:40 -08:00
dfbac38281 web: style blockquotes in emails 2025-02-05 09:18:05 -08:00
f857c38625 Bumping version to 0.0.133 2025-02-02 09:52:05 -08:00
23823cd85e web: provide CSS overrides for email matching news posts 2025-02-02 09:51:27 -08:00
30b5d0ff9f Bumping version to 0.0.132 2025-01-30 20:19:21 -08:00
60a3b1ef88 web: remove accidentally committed line 2025-01-30 20:18:36 -08:00
a46390d110 Bumping version to 0.0.131 2025-01-30 17:45:35 -08:00
5baac0c77a web: fix width overflow on mobile and maybe progress bar 2025-01-30 17:45:14 -08:00
e6181d41ed web: address a bunch of dead code lint 2025-01-30 15:24:11 -08:00
6a228cfd5e Bumping version to 0.0.130 2025-01-30 14:16:30 -08:00
8d81067206 cargo sqlx prepare 2025-01-30 14:16:29 -08:00
b2e47a9bd4 server: round-robin by site when indexing searches 2025-01-30 14:16:12 -08:00
4eaf50cde4 Bumping version to 0.0.129 2025-01-30 13:55:52 -08:00
f20afe5447 update sqlx prepare 2025-01-30 13:55:38 -08:00
53093f4cce Bumping version to 0.0.128 2025-01-30 13:52:55 -08:00
9324a34d31 cargo sqlx prepare 2025-01-30 13:52:54 -08:00
eecc4bc3ef server: strip style & script tags, also handle some retryable errors on slurp 2025-01-30 13:52:22 -08:00
795029cb06 Bumping version to 0.0.127 2025-01-29 17:25:55 -08:00
bc0135106f server: error when get request has a bad response code 2025-01-29 17:25:26 -08:00
bd2803f81c Bumping version to 0.0.126 2025-01-29 17:10:42 -08:00
215addc2c0 cargo sqlx prepare 2025-01-29 17:10:41 -08:00
69f8e24689 server: index newest news posts first 2025-01-29 17:10:26 -08:00
0817a7a51b Bumping version to 0.0.125 2025-01-29 17:04:16 -08:00
200933591a cargo sqlx prepare 2025-01-29 17:04:15 -08:00
8b7c819b17 server: only index 100 search summaries at a time 2025-01-29 17:03:47 -08:00
dce433ab5a Bumping version to 0.0.124 2025-01-29 16:53:59 -08:00
eb4f2d8b5d server: filter out bad urls when indexing search summary 2025-01-29 16:53:38 -08:00
2008457911 Bumping version to 0.0.123 2025-01-29 16:13:50 -08:00
f6b57e63fd cargo sqlx prepare 2025-01-29 16:13:50 -08:00
d681612e8e server: index all search summaries on refresh 2025-01-29 16:13:44 -08:00
80454cbc7e Bumping version to 0.0.122 2025-01-29 15:44:05 -08:00
78cf59333e cargo sqlx prepare 2025-01-29 15:44:04 -08:00
ab47f32b52 server: fetch search summaries in parallel 2025-01-29 15:43:46 -08:00
d9d58afed9 Bumping version to 0.0.121 2025-01-29 15:24:55 -08:00
d01f9a7e08 cargo sqlx prepare 2025-01-29 15:24:54 -08:00
c6aabf88b9 server: sample DB for missing indexes, should prevent duplication from separate threads 2025-01-29 14:42:59 -08:00
29bf6d9b6d Bumping version to 0.0.120 2025-01-29 14:08:55 -08:00
92bf45bd15 cargo sqlx prepare 2025-01-29 14:08:54 -08:00
12c8e0e33b server: use fetched contents of news for search index 2025-01-29 14:08:20 -08:00
c7aa32b922 Bumping version to 0.0.119 2025-01-28 09:34:56 -08:00
94be4ec572 web: add archive buttons, and adjust when text on buttons is shown 2025-01-28 09:34:36 -08:00
66c299bc4c Bumping version to 0.0.118 2025-01-27 15:48:12 -08:00
d5c4176392 cargo sqlx prepare 2025-01-27 15:48:11 -08:00
bd00542c28 server: use clean_summary field instead of summary 2025-01-27 15:47:55 -08:00
19f029cb6b Bumping version to 0.0.117 2025-01-27 14:15:00 -08:00
198db1492a server: add another The Onion slurp config 2025-01-27 14:14:46 -08:00
f6665b6b6e Bumping version to 0.0.116 2025-01-27 14:01:30 -08:00
ee93d725ba web & server: finish initial tailwind rewrite 2025-01-27 14:00:46 -08:00
70fb635eda server: index on nzb_posts created_at, attempt to speed up homepage 2025-01-27 13:18:36 -08:00
b9fbefe05c server: format chrome css 2025-01-27 13:17:22 -08:00
46f823baae server: use local slurp cache separate from production 2025-01-27 13:16:55 -08:00
cc1e998ec5 web: style version chart 2025-01-26 16:01:35 -08:00
fb73d8272e web: update style for rendering emails, including attachments 2025-01-26 15:56:08 -08:00
87321fb669 web: update stylings for removable tag chiclets 2025-01-26 14:02:39 -08:00
44b60d5070 web: style checkboxes, tweak mobile search bar width 2025-01-26 13:42:20 -08:00
89897aa48f web: style search toolbar 2025-01-26 12:24:06 -08:00
b2879211e4 web: much nicer tag list styling with flex box 2025-01-26 10:58:27 -08:00
49 changed files with 3905 additions and 2168 deletions

.gitea/workflows/rust.yml Normal file

@@ -0,0 +1,67 @@
on: [push]
name: Continuous integration
jobs:
  check:
    name: Check
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions-rust-lang/setup-rust-toolchain@v1
      - run: cargo check
  test:
    name: Test Suite
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions-rust-lang/setup-rust-toolchain@v1
      - run: cargo test
  trunk:
    name: Trunk
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions-rust-lang/setup-rust-toolchain@v1
        with:
          toolchain: stable
          target: wasm32-unknown-unknown
      - run: cargo install trunk
      - run: cd web; trunk build
  fmt:
    name: Rustfmt
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions-rust-lang/setup-rust-toolchain@v1
        with:
          components: rustfmt
      - name: Rustfmt Check
        uses: actions-rust-lang/rustfmt@v1
  build:
    name: build
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions-rust-lang/setup-rust-toolchain@v1
      - run: cargo build
  udeps:
    name: Disallow unused dependencies
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions-rust-lang/setup-rust-toolchain@v1
        with:
          toolchain: nightly
      - name: Run cargo-udeps
        uses: aig787/cargo-udeps-action@v1
        with:
          version: 'latest'
          args: '--all-targets'

Cargo.lock generated

File diff suppressed because it is too large

Cargo.toml

@@ -3,6 +3,14 @@ resolver = "2"
 default-members = ["server"]
 members = ["web", "server", "notmuch", "procmail2notmuch", "shared"]
 
+[workspace.package]
+authors = ["Bill Thiede <git@xinu.tv>"]
+edition = "2021"
+license = "UNLICENSED"
+publish = ["xinu"]
+version = "0.10.0"
+repository = "https://git.z.xinu.tv/wathiede/letterbox"
+
 [profile.dev]
 opt-level = 1

Justfile Normal file

@@ -0,0 +1,19 @@
export CARGO_INCREMENTAL := "0"
export RUSTFLAGS := "-D warnings"

default:
    @echo "Run: just patch|minor|major"

major: (_release "major")
minor: (_release "minor")
patch: (_release "patch")

sqlx-prepare:
    cd server; cargo sqlx prepare && git add .sqlx; git commit -m "cargo sqlx prepare" .sqlx || true

pull:
    git pull

_release level: pull sqlx-prepare
    cargo-release release -x {{ level }} --workspace --no-confirm --registry=xinu

notmuch/Cargo.toml

@@ -1,18 +1,24 @@
 [package]
-name = "notmuch"
-version = "0.0.115"
-edition = "2021"
+name = "letterbox-notmuch"
+exclude = ["/testdata"]
+description = "Wrapper for calling notmuch cli"
+authors.workspace = true
+edition.workspace = true
+license.workspace = true
+publish.workspace = true
+repository.workspace = true
+version.workspace = true
 
 # See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
 
 [dependencies]
 log = "0.4.14"
+mailparse = "0.16.0"
 serde = { version = "1.0", features = ["derive"] }
 serde_json = { version = "1.0", features = ["unbounded_depth"] }
-thiserror = "1.0.30"
+thiserror = "2.0.0"
 tracing = "0.1.41"
 
 [dev-dependencies]
-itertools = "0.10.1"
+itertools = "0.14.0"
 pretty_assertions = "1"
 rayon = "1.5"

notmuch/src/lib.rs

@@ -207,6 +207,7 @@
//! ``` //! ```
use std::{ use std::{
collections::HashMap,
ffi::OsStr, ffi::OsStr,
io::{self}, io::{self},
path::{Path, PathBuf}, path::{Path, PathBuf},
@@ -270,6 +271,12 @@ pub struct Headers {
#[serde(skip_serializing_if = "Option::is_none")] #[serde(skip_serializing_if = "Option::is_none")]
pub bcc: Option<String>, pub bcc: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")] #[serde(skip_serializing_if = "Option::is_none")]
#[serde(alias = "Delivered-To")]
pub delivered_to: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
#[serde(alias = "X-Original-To")]
pub x_original_to: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub reply_to: Option<String>, pub reply_to: Option<String>,
pub date: String, pub date: String,
} }
@@ -459,6 +466,8 @@ pub enum NotmuchError {
StringUtf8Error(#[from] std::string::FromUtf8Error), StringUtf8Error(#[from] std::string::FromUtf8Error),
#[error("failed to parse str as int")] #[error("failed to parse str as int")]
ParseIntError(#[from] std::num::ParseIntError), ParseIntError(#[from] std::num::ParseIntError),
#[error("failed to parse mail: {0}")]
MailParseError(#[from] mailparse::MailParseError),
} }
#[derive(Default)] #[derive(Default)]
@@ -605,6 +614,80 @@ impl Notmuch {
Ok(serde_json::from_slice(&res)?) Ok(serde_json::from_slice(&res)?)
} }
#[instrument(skip_all)]
pub fn unread_recipients(&self) -> Result<HashMap<String, usize>, NotmuchError> {
let slice = self.run_notmuch([
"show",
"--include-html=false",
"--entire-thread=false",
"--body=false",
"--format=json",
// Arbitrary limit to prevent too much work
"--limit=1000",
"is:unread",
])?;
// Notmuch can emit JSON containing invalid UTF-8, so we lossily convert it to a string
// here and use that for parsing in Rust.
let s = String::from_utf8_lossy(&slice);
let mut deserializer = serde_json::Deserializer::from_str(&s);
deserializer.disable_recursion_limit();
let ts: ThreadSet = serde::de::Deserialize::deserialize(&mut deserializer)?;
deserializer.end()?;
let mut r = HashMap::new();
fn collect_from_thread_node(
r: &mut HashMap<String, usize>,
tn: &ThreadNode,
) -> Result<(), NotmuchError> {
let Some(msg) = &tn.0 else {
return Ok(());
};
let mut addrs = vec![];
let hdr = &msg.headers.to;
if let Some(to) = hdr {
addrs.push(to);
} else {
let hdr = &msg.headers.x_original_to;
if let Some(to) = hdr {
addrs.push(to);
} else {
let hdr = &msg.headers.delivered_to;
if let Some(to) = hdr {
addrs.push(to);
};
};
};
let hdr = &msg.headers.cc;
if let Some(cc) = hdr {
addrs.push(cc);
};
for recipient in addrs {
mailparse::addrparse(&recipient)?
.into_inner()
.iter()
.for_each(|a| {
let mailparse::MailAddr::Single(si) = a else {
return;
};
let addr = &si.addr;
if addr == "couchmoney@gmail.com" || addr.ends_with("@xinu.tv") {
*r.entry(addr.to_lowercase()).or_default() += 1;
}
});
}
Ok(())
}
for t in ts.0 {
for tn in t.0 {
collect_from_thread_node(&mut r, &tn)?;
for sub_tn in tn.1 {
collect_from_thread_node(&mut r, &sub_tn)?;
}
}
}
Ok(r)
}
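The nested To / X-Original-To / Delivered-To fallback above could be flattened with `Option::or` chaining. A minimal sketch (the `Headers` struct here is a hypothetical stand-in, not the crate's actual type):

```rust
// Hypothetical stand-in for the crate's Headers type; only the fields
// involved in the fallback chain are modeled.
struct Headers {
    to: Option<String>,
    x_original_to: Option<String>,
    delivered_to: Option<String>,
}

// First match wins: To, then X-Original-To, then Delivered-To —
// the same priority as the nested if/else chain above.
fn primary_recipient(h: &Headers) -> Option<&String> {
    h.to
        .as_ref()
        .or(h.x_original_to.as_ref())
        .or(h.delivered_to.as_ref())
}

fn main() {
    let h = Headers {
        to: None,
        x_original_to: Some("alice@example.com".into()),
        delivered_to: Some("bob@example.com".into()),
    };
    // X-Original-To wins because To is absent.
    println!("{:?}", primary_recipient(&h));
}
```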
fn run_notmuch<I, S>(&self, args: I) -> Result<Vec<u8>, NotmuchError> fn run_notmuch<I, S>(&self, args: I) -> Result<Vec<u8>, NotmuchError>
where where
I: IntoIterator<Item = S>, I: IntoIterator<Item = S>,


@@ -4,7 +4,7 @@ use std::{
time::Instant, time::Instant,
}; };
use notmuch::Notmuch; use letterbox_notmuch::Notmuch;
use rayon::iter::{ParallelBridge, ParallelIterator}; use rayon::iter::{ParallelBridge, ParallelIterator};
#[test] #[test]


@@ -1,7 +1,12 @@
[package] [package]
name = "procmail2notmuch" name = "letterbox-procmail2notmuch"
version = "0.0.115" description = "Tool for generating notmuch rules from procmail"
edition = "2021" authors.workspace = true
edition.workspace = true
license.workspace = true
publish.workspace = true
repository.workspace = true
version.workspace = true
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html # See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

renovate.json (new file)

@@ -0,0 +1,6 @@
{
"$schema": "https://docs.renovatebot.com/renovate-schema.json",
"extends": [
"config:recommended"
]
}


@@ -1,6 +1,6 @@
{ {
"db_name": "PostgreSQL", "db_name": "PostgreSQL",
"query": "SELECT\n date,\n is_read,\n link,\n site,\n summary,\n title,\n name,\n homepage\nFROM\n post p\n JOIN feed f ON p.site = f.slug\nWHERE\n uid = $1\n", "query": "SELECT\n date,\n is_read,\n link,\n site,\n summary,\n clean_summary,\n title,\n name,\n homepage\nFROM\n post AS p\nINNER JOIN feed AS f ON p.site = f.slug\nWHERE\n uid = $1\n",
"describe": { "describe": {
"columns": [ "columns": [
{ {
@@ -30,16 +30,21 @@
}, },
{ {
"ordinal": 5, "ordinal": 5,
"name": "title", "name": "clean_summary",
"type_info": "Text" "type_info": "Text"
}, },
{ {
"ordinal": 6, "ordinal": 6,
"name": "name", "name": "title",
"type_info": "Text" "type_info": "Text"
}, },
{ {
"ordinal": 7, "ordinal": 7,
"name": "name",
"type_info": "Text"
},
{
"ordinal": 8,
"name": "homepage", "name": "homepage",
"type_info": "Text" "type_info": "Text"
} }
@@ -57,8 +62,9 @@
true, true,
true, true,
true, true,
true,
true true
] ]
}, },
"hash": "113694cd5bf0d2582ff3a635776daa608fe88abe1185958c4215646c92335afb" "hash": "383221a94bc3746322ba78e41cde37994440ee67dc32e88d2394c51211bde6cd"
} }


@@ -0,0 +1,32 @@
{
"db_name": "PostgreSQL",
"query": "SELECT\n p.id,\n link,\n clean_summary\nFROM\n post AS p\nINNER JOIN feed AS f ON p.site = f.slug -- necessary to weed out nzb posts\nWHERE\n search_summary IS NULL\n -- TODO remove AND link ~ '^<'\nORDER BY\n ROW_NUMBER() OVER (PARTITION BY site ORDER BY date DESC)\nLIMIT 100;\n",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "id",
"type_info": "Int4"
},
{
"ordinal": 1,
"name": "link",
"type_info": "Text"
},
{
"ordinal": 2,
"name": "clean_summary",
"type_info": "Text"
}
],
"parameters": {
"Left": []
},
"nullable": [
false,
false,
true
]
},
"hash": "3d271b404f06497a5dcde68cf6bf07291d70fa56058ea736ac24e91d33050c04"
}
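The `ORDER BY ROW_NUMBER() OVER (PARTITION BY site ORDER BY date DESC)` clause in this query round-robins the backlog across sites, so no single feed can monopolize the 100-row batch. A rough in-memory equivalent of that ranking (illustrative only, made-up data):

```rust
use std::cmp::Reverse;
use std::collections::HashMap;

// Order (site, date) rows by per-site rank (newest first within a site),
// mimicking ROW_NUMBER() OVER (PARTITION BY site ORDER BY date DESC).
fn interleave_by_site(mut rows: Vec<(&'static str, u32)>) -> Vec<(&'static str, u32)> {
    // Newest first within each site.
    rows.sort_by_key(|r| (r.0, Reverse(r.1)));
    // Assign each row its 1-based rank within its site.
    let mut rank: HashMap<&str, u32> = HashMap::new();
    let mut ranked: Vec<(u32, (&'static str, u32))> = rows
        .into_iter()
        .map(|row| {
            let r = rank.entry(row.0).or_insert(0);
            *r += 1;
            (*r, row)
        })
        .collect();
    // Stable sort by rank interleaves sites: every site's first row
    // comes before any site's second row.
    ranked.sort_by_key(|(r, _)| *r);
    ranked.into_iter().map(|(_, row)| row).collect()
}

fn main() {
    let rows = vec![("a", 3), ("a", 2), ("b", 9), ("a", 1)];
    // One row per site before any site's second row.
    println!("{:?}", interleave_by_site(rows));
}
```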


@@ -0,0 +1,24 @@
{
"db_name": "PostgreSQL",
"query": "SELECT COUNT(*) AS count\nFROM\n post\nWHERE\n (\n $1::text IS NULL\n OR site = $1\n )\n AND (\n NOT $2\n OR NOT is_read\n )\n AND (\n $3::text IS NULL\n OR TO_TSVECTOR('english', search_summary)\n @@ WEBSEARCH_TO_TSQUERY('english', $3)\n )\n",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "count",
"type_info": "Int8"
}
],
"parameters": {
"Left": [
"Text",
"Bool",
"Text"
]
},
"nullable": [
null
]
},
"hash": "8c1b3c78649135e98b89092237750088433f7ff1b7c2ddeedec553406ea9f203"
}


@@ -1,24 +0,0 @@
{
"db_name": "PostgreSQL",
"query": "SELECT\n COUNT(*) count\nFROM\n post\nWHERE\n (\n $1 :: text IS NULL\n OR site = $1\n )\n AND (\n NOT $2\n OR NOT is_read\n )\n AND (\n $3 :: text IS NULL\n OR to_tsvector('english', summary) @@ websearch_to_tsquery('english', $3)\n )\n",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "count",
"type_info": "Int8"
}
],
"parameters": {
"Left": [
"Text",
"Bool",
"Text"
]
},
"nullable": [
null
]
},
"hash": "e118f546c628661023aa25803bb29affb6cd25eca63246e5ace5b90a845d76ac"
}


@@ -0,0 +1,15 @@
{
"db_name": "PostgreSQL",
"query": "UPDATE post SET search_summary = $1 WHERE id = $2",
"describe": {
"columns": [],
"parameters": {
"Left": [
"Text",
"Int4"
]
},
"nullable": []
},
"hash": "ef8327f039dbfa8f4e59b7a77a6411252a346bf51cf940024a17d9fbb2df173c"
}


@@ -1,6 +1,6 @@
{ {
"db_name": "PostgreSQL", "db_name": "PostgreSQL",
"query": "SELECT\n site,\n date,\n is_read,\n title,\n uid,\n name\nFROM\n post p\n JOIN feed f ON p.site = f.slug\nWHERE\n ($1::text IS NULL OR site = $1)\n AND (\n NOT $2\n OR NOT is_read\n )\n AND (\n $5 :: text IS NULL\n OR to_tsvector('english', summary) @@ websearch_to_tsquery('english', $5)\n )\nORDER BY\n date DESC,\n title OFFSET $3\nLIMIT\n $4\n", "query": "SELECT\n site,\n date,\n is_read,\n title,\n uid,\n name\nFROM\n post p\n JOIN feed f ON p.site = f.slug\nWHERE\n ($1::text IS NULL OR site = $1)\n AND (\n NOT $2\n OR NOT is_read\n )\n AND (\n $5 :: text IS NULL\n OR to_tsvector('english', search_summary) @@ websearch_to_tsquery('english', $5)\n )\nORDER BY\n date DESC,\n title OFFSET $3\nLIMIT\n $4\n",
"describe": { "describe": {
"columns": [ "columns": [
{ {
@@ -52,5 +52,5 @@
true true
] ]
}, },
"hash": "99114d4840067acb12d9a41ef036bdd8ecf87cfdde8ce4985821485816af5213" "hash": "fc4607f02cc76a5f3a6629cce4507c74f52ae44820897b47365da3f339d1da06"
} }


@@ -1,51 +1,58 @@
[package] [package]
name = "letterbox-server" name = "letterbox-server"
version = "0.0.115"
edition = "2021"
default-run = "letterbox-server" default-run = "letterbox-server"
description = "Backend for letterbox"
authors.workspace = true
edition.workspace = true
license.workspace = true
publish.workspace = true
repository.workspace = true
version.workspace = true
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html # See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies] [dependencies]
ammonia = "3.3.0" ammonia = "4.0.0"
anyhow = "1.0.79" anyhow = "1.0.79"
async-graphql = { version = "7", features = ["log"] } async-graphql = { version = "7", features = ["log"] }
async-graphql-rocket = "7" async-graphql-rocket = "7"
async-trait = "0.1.81" async-trait = "0.1.81"
build-info = "0.0.38" build-info = "0.0.40"
cacher = { version = "0.1.0", registry = "xinu" } cacher = { version = "0.1.0", registry = "xinu" }
chrono = "0.4.39" chrono = "0.4.39"
clap = { version = "4.5.23", features = ["derive"] } clap = { version = "4.5.23", features = ["derive"] }
css-inline = "0.13.0" css-inline = "0.14.0"
futures = "0.3.31"
html-escape = "0.2.13" html-escape = "0.2.13"
linkify = "0.10.0" linkify = "0.10.0"
log = "0.4.17" log = "0.4.17"
lol_html = "1.2.0" lol_html = "2.0.0"
mailparse = "0.15.0" mailparse = "0.16.0"
maplit = "1.0.2" maplit = "1.0.2"
memmap = "0.7.0" memmap = "0.7.0"
notmuch = { path = "../notmuch" } opentelemetry = "0.28.0"
opentelemetry = "0.27.1" regex = "1.11.1"
reqwest = { version = "0.12.7", features = ["blocking"] } reqwest = { version = "0.12.7", features = ["blocking"] }
rocket = { version = "0.5.0-rc.2", features = ["json"] } rocket = { version = "0.5.0-rc.2", features = ["json"] }
rocket_cors = "0.6.0" rocket_cors = "0.6.0"
scraper = "0.20.0" scraper = "0.23.0"
serde = { version = "1.0.147", features = ["derive"] } serde = { version = "1.0.147", features = ["derive"] }
serde_json = "1.0.87" serde_json = "1.0.87"
shared = { path = "../shared" }
sqlx = { version = "0.8.2", features = ["postgres", "runtime-tokio", "time"] } sqlx = { version = "0.8.2", features = ["postgres", "runtime-tokio", "time"] }
tantivy = { version = "0.22.0", optional = true } tantivy = { version = "0.22.0", optional = true }
thiserror = "1.0.37" thiserror = "2.0.0"
tokio = "1.26.0" tokio = "1.26.0"
tracing = "0.1.41" tracing = "0.1.41"
url = "2.5.2" url = "2.5.2"
urlencoding = "2.1.3" urlencoding = "2.1.3"
#xtracing = { path = "../../xtracing" } #xtracing = { path = "../../xtracing" }
#xtracing = { git = "http://git-private.h.xinu.tv/wathiede/xtracing.git" } #xtracing = { git = "http://git-private.h.xinu.tv/wathiede/xtracing.git" }
xtracing = { version = "0.1.0", registry = "xinu" } xtracing = { version = "0.3.0", registry = "xinu" }
letterbox-notmuch = { version = "0.10.0", path = "../notmuch", registry = "xinu" }
letterbox-shared = { version = "0.10.0", path = "../shared", registry = "xinu" }
[build-dependencies] [build-dependencies]
build-info-build = "0.0.38" build-info-build = "0.0.40"
[features] [features]
#default = [ "tantivy" ] #default = [ "tantivy" ]


@@ -11,4 +11,4 @@ port = 9345
#log_level = "critical" #log_level = "critical"
newsreader_database_url = "postgres://newsreader@nixos-07.h.xinu.tv/newsreader" newsreader_database_url = "postgres://newsreader@nixos-07.h.xinu.tv/newsreader"
newsreader_tantivy_db_path = "../target/database/newsreader" newsreader_tantivy_db_path = "../target/database/newsreader"
slurp_cache_path = "/net/nasx/x/letterbox/slurp" slurp_cache_path = "/tmp/letterbox/slurp"


@@ -0,0 +1,2 @@
-- Add down migration script here
DROP INDEX nzb_posts_created_at_idx;


@@ -0,0 +1,2 @@
-- Add up migration script here
CREATE INDEX nzb_posts_created_at_idx ON nzb_posts USING btree (created_at);


@@ -0,0 +1,15 @@
-- Add down migration script here
BEGIN;
DROP INDEX IF EXISTS post_search_summary_idx;
ALTER TABLE post DROP search_summary;
-- CREATE INDEX post_summary_idx ON post USING gin (to_tsvector(
-- 'english',
-- regexp_replace(
-- regexp_replace(summary, '<[^>]+>', ' ', 'g'),
-- '\s+',
-- ' ',
-- 'g'
-- )
-- ));
COMMIT;


@@ -0,0 +1,14 @@
-- Add up migration script here
BEGIN;
DROP INDEX IF EXISTS post_summary_idx;
ALTER TABLE post ADD search_summary TEXT;
CREATE INDEX post_search_summary_idx ON post USING gin (
to_tsvector('english', search_summary)
);
UPDATE post SET search_summary = regexp_replace(
regexp_replace(summary, '<[^>]+>', ' ', 'g'),
'\s+',
' ',
'g'
);
COMMIT;
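The migration's nested `regexp_replace` calls strip HTML tags and then collapse whitespace runs to build `search_summary`. A pure-std Rust sketch of the same two steps (hand-rolled and illustrative only; like the SQL regex, it does not handle `>` inside quoted attribute values):

```rust
// Strip <...> tags and collapse whitespace runs to a single space,
// mirroring regexp_replace(regexp_replace(summary, '<[^>]+>', ' ', 'g'), '\s+', ' ', 'g').
fn search_summary(html: &str) -> String {
    let mut out = String::new();
    let mut in_tag = false;
    for c in html.chars() {
        match c {
            '<' => {
                in_tag = true;
                out.push(' '); // tags become whitespace, as in the SQL
            }
            '>' if in_tag => in_tag = false,
            _ if !in_tag => out.push(c),
            _ => {} // character inside a tag: drop it
        }
    }
    // Collapse whitespace runs, mimicking the outer regexp_replace.
    out.split_whitespace().collect::<Vec<_>>().join(" ")
}

fn main() {
    println!("{}", search_summary("<p>Hello <b>world</b></p>"));
}
```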


@@ -1,10 +1,9 @@
SELECT SELECT COUNT(*) AS count
COUNT(*) count
FROM FROM
post post
WHERE WHERE
( (
$1 :: text IS NULL $1::text IS NULL
OR site = $1 OR site = $1
) )
AND ( AND (
@@ -12,6 +11,7 @@ WHERE
OR NOT is_read OR NOT is_read
) )
AND ( AND (
$3 :: text IS NULL $3::text IS NULL
OR to_tsvector('english', summary) @@ websearch_to_tsquery('english', $3) OR TO_TSVECTOR('english', search_summary)
@@ WEBSEARCH_TO_TSQUERY('english', $3)
) )


@@ -0,0 +1,13 @@
SELECT
p.id,
link,
clean_summary
FROM
post AS p
INNER JOIN feed AS f ON p.site = f.slug -- necessary to weed out nzb posts
WHERE
search_summary IS NULL
-- TODO remove AND link ~ '^<'
ORDER BY
ROW_NUMBER() OVER (PARTITION BY site ORDER BY date DESC)
LIMIT 100;


@@ -4,11 +4,12 @@ SELECT
link, link,
site, site,
summary, summary,
clean_summary,
title, title,
name, name,
homepage homepage
FROM FROM
post p post AS p
JOIN feed f ON p.site = f.slug INNER JOIN feed AS f ON p.site = f.slug
WHERE WHERE
uid = $1 uid = $1


@@ -16,7 +16,7 @@ WHERE
) )
AND ( AND (
$5 :: text IS NULL $5 :: text IS NULL
OR to_tsvector('english', summary) @@ websearch_to_tsquery('english', $5) OR to_tsvector('english', search_summary) @@ websearch_to_tsquery('english', $5)
) )
ORDER BY ORDER BY
date DESC, date DESC,


@@ -0,0 +1,13 @@
select t.id, tt.tokid, tt.alias, length(t.token), t.token from (
select id, (ts_parse('default',
-- regexp_replace(
-- regexp_replace(summary, '<[^>]+>', ' ', 'g'),
-- '\s+',
-- ' ',
-- 'g'
-- )
summary
)).* from post) t
inner join ts_token_type('default') tt
on t.tokid = tt.tokid
where length(token) >= 2*1024;


@@ -0,0 +1,21 @@
use std::fs;
use url::Url;
fn main() -> anyhow::Result<()> {
println!("PWD: {}", std::env::current_dir()?.display());
let _url = "https://slashdot.org/story/25/01/24/1813201/walgreens-replaced-fridge-doors-with-smart-screens-its-now-a-200-million-fiasco?utm_source=rss1.0mainlinkanon&utm_medium=feed";
let _url = "https://hackaday.com/2025/01/24/hackaday-podcast-episode-305-caustic-clocks-practice-bones-and-brick-layers/";
let _url = "https://theonion.com/monster-devastated-to-see-film-depicting-things-he-told-guillermo-del-toro-in-confidence/";
let _url = "https://trofi.github.io/posts/330-another-nix-language-nondeterminism-example.html";
let _url = "https://blog.cloudflare.com/ddos-threat-report-for-2024-q4/";
let url = "https://trofi.github.io/posts/330-another-nix-language-nondeterminism-example.html";
let body = reqwest::blocking::get(url)?.text()?;
let output = "/tmp/h2md/output.html";
let inliner = css_inline::CSSInliner::options()
.base_url(Url::parse(url).ok())
.build();
let inlined = inliner.inline(&body)?;
fs::write(output, inlined)?;
Ok(())
}


@@ -7,6 +7,8 @@ use std::{error::Error, io::Cursor, str::FromStr};
use async_graphql::{extensions, http::GraphiQLSource, EmptySubscription, Schema}; use async_graphql::{extensions, http::GraphiQLSource, EmptySubscription, Schema};
use async_graphql_rocket::{GraphQLQuery, GraphQLRequest, GraphQLResponse}; use async_graphql_rocket::{GraphQLQuery, GraphQLRequest, GraphQLResponse};
use cacher::FilesystemCacher;
use letterbox_notmuch::{Notmuch, NotmuchError, ThreadSet};
#[cfg(feature = "tantivy")] #[cfg(feature = "tantivy")]
use letterbox_server::tantivy::TantivyConnection; use letterbox_server::tantivy::TantivyConnection;
use letterbox_server::{ use letterbox_server::{
@@ -15,7 +17,6 @@ use letterbox_server::{
graphql::{Attachment, GraphqlSchema, Mutation, QueryRoot}, graphql::{Attachment, GraphqlSchema, Mutation, QueryRoot},
nm::{attachment_bytes, cid_attachment_bytes}, nm::{attachment_bytes, cid_attachment_bytes},
}; };
use notmuch::{Notmuch, NotmuchError, ThreadSet};
use rocket::{ use rocket::{
fairing::AdHoc, fairing::AdHoc,
http::{ContentType, Header}, http::{ContentType, Header},
@@ -178,7 +179,7 @@ async fn graphql_request(
async fn main() -> Result<(), Box<dyn Error>> { async fn main() -> Result<(), Box<dyn Error>> {
let _guard = xtracing::init(env!("CARGO_BIN_NAME"))?; let _guard = xtracing::init(env!("CARGO_BIN_NAME"))?;
build_info::build_info!(fn bi); build_info::build_info!(fn bi);
info!("Build Info: {}", shared::build_version(bi)); info!("Build Info: {}", letterbox_shared::build_version(bi));
let allowed_origins = AllowedOrigins::all(); let allowed_origins = AllowedOrigins::all();
let cors = rocket_cors::CorsOptions { let cors = rocket_cors::CorsOptions {
allowed_origins, allowed_origins,
@@ -194,7 +195,7 @@ async fn main() -> Result<(), Box<dyn Error>> {
let rkt = rocket::build() let rkt = rocket::build()
.mount( .mount(
shared::urls::MOUNT_POINT, letterbox_shared::urls::MOUNT_POINT,
routes![ routes![
original, original,
show_pretty, show_pretty,
@@ -220,9 +221,10 @@ async fn main() -> Result<(), Box<dyn Error>> {
#[cfg(feature = "tantivy")] #[cfg(feature = "tantivy")]
let tantivy_conn = TantivyConnection::new(&config.newsreader_tantivy_db_path)?; let tantivy_conn = TantivyConnection::new(&config.newsreader_tantivy_db_path)?;
let cacher = FilesystemCacher::new(&config.slurp_cache_path)?;
let schema = Schema::build(QueryRoot, Mutation, EmptySubscription) let schema = Schema::build(QueryRoot, Mutation, EmptySubscription)
.data(Notmuch::default()) .data(Notmuch::default())
.data(config) .data(cacher)
.data(pool.clone()); .data(pool.clone());
#[cfg(feature = "tantivy")] #[cfg(feature = "tantivy")]

File diff suppressed because it is too large.


@@ -1,8 +0,0 @@
pre {
background-color: var(--color-bg);
color: var(--color-text);
}
code {
background-color: var(--color-bg-secondary);
}


@@ -10,7 +10,7 @@ use crate::TransformError;
#[derive(Error, Debug)] #[derive(Error, Debug)]
pub enum ServerError { pub enum ServerError {
#[error("notmuch: {0}")] #[error("notmuch: {0}")]
NotmuchError(#[from] notmuch::NotmuchError), NotmuchError(#[from] letterbox_notmuch::NotmuchError),
#[error("flatten")] #[error("flatten")]
FlattenError, FlattenError,
#[error("mail parse error: {0}")] #[error("mail parse error: {0}")]


@@ -5,8 +5,9 @@ use async_graphql::{
Context, EmptySubscription, Enum, Error, FieldResult, InputObject, Object, Schema, Context, EmptySubscription, Enum, Error, FieldResult, InputObject, Object, Schema,
SimpleObject, Union, SimpleObject, Union,
}; };
use cacher::FilesystemCacher;
use letterbox_notmuch::Notmuch;
use log::info; use log::info;
use notmuch::Notmuch;
use serde::{Deserialize, Serialize}; use serde::{Deserialize, Serialize};
use sqlx::postgres::PgPool; use sqlx::postgres::PgPool;
use tokio::join; use tokio::join;
@@ -14,7 +15,7 @@ use tracing::instrument;
#[cfg(feature = "tantivy")] #[cfg(feature = "tantivy")]
use crate::tantivy::TantivyConnection; use crate::tantivy::TantivyConnection;
use crate::{config::Config, newsreader, nm, Query}; use crate::{newsreader, nm, Query};
/// # Number of seconds since the Epoch /// # Number of seconds since the Epoch
pub type UnixTime = isize; pub type UnixTime = isize;
@@ -94,6 +95,10 @@ pub struct Message {
pub to: Vec<Email>, pub to: Vec<Email>,
// All CC headers found in email // All CC headers found in email
pub cc: Vec<Email>, pub cc: Vec<Email>,
// X-Original-To header found in email
pub x_original_to: Option<Email>,
// Delivered-To header found in email
pub delivered_to: Option<Email>,
// First Subject header found in email // First Subject header found in email
pub subject: Option<String>, pub subject: Option<String>,
// Parsed Date header, if found and valid // Parsed Date header, if found and valid
@@ -282,10 +287,10 @@ pub struct QueryRoot;
impl QueryRoot { impl QueryRoot {
async fn version<'ctx>(&self, _ctx: &Context<'ctx>) -> Result<String, Error> { async fn version<'ctx>(&self, _ctx: &Context<'ctx>) -> Result<String, Error> {
build_info::build_info!(fn bi); build_info::build_info!(fn bi);
Ok(shared::build_version(bi)) Ok(letterbox_shared::build_version(bi))
} }
#[instrument(skip_all, fields(query=query))] #[instrument(skip_all, fields(query=query))]
#[instrument(skip_all, fields(query=query, request_id=request_id()))] #[instrument(skip_all, fields(query=query, rid=request_id()))]
async fn count<'ctx>(&self, ctx: &Context<'ctx>, query: String) -> Result<usize, Error> { async fn count<'ctx>(&self, ctx: &Context<'ctx>, query: String) -> Result<usize, Error> {
let nm = ctx.data_unchecked::<Notmuch>(); let nm = ctx.data_unchecked::<Notmuch>();
let pool = ctx.data_unchecked::<PgPool>(); let pool = ctx.data_unchecked::<PgPool>();
@@ -305,10 +310,49 @@ impl QueryRoot {
info!("count {newsreader_query:?} newsreader count {newsreader_count} notmuch count {notmuch_count} tantivy count {tantivy_count} total {total}"); info!("count {newsreader_query:?} newsreader count {newsreader_count} notmuch count {notmuch_count} tantivy count {tantivy_count} total {total}");
Ok(total) Ok(total)
} }
async fn catchup<'ctx>(
&self,
ctx: &Context<'ctx>,
query: String,
) -> Result<Vec<String>, Error> {
let nm = ctx.data_unchecked::<Notmuch>();
let pool = ctx.data_unchecked::<PgPool>();
let query: Query = query.parse()?;
// TODO: implement optimized versions of fetching just IDs
let newsreader_fut = newsreader_search(pool, None, None, None, None, &query);
let notmuch_fut = notmuch_search(nm, None, None, None, None, &query);
let (newsreader_results, notmuch_results) = join!(newsreader_fut, notmuch_fut);
let newsreader_results = newsreader_results?;
let notmuch_results = notmuch_results?;
info!(
"newsreader_results ({}) notmuch_results ({})",
newsreader_results.len(),
notmuch_results.len(),
);
let mut results: Vec<_> = newsreader_results
.into_iter()
.chain(notmuch_results)
.collect();
// The leading '-' is to reverse sort
results.sort_by_key(|item| match item {
ThreadSummaryCursor::Newsreader(_, ts) => -ts.timestamp,
ThreadSummaryCursor::Notmuch(_, ts) => -ts.timestamp,
});
let ids = results
.into_iter()
.map(|r| match r {
ThreadSummaryCursor::Newsreader(_, ts) => ts.thread,
ThreadSummaryCursor::Notmuch(_, ts) => ts.thread,
})
.collect();
Ok(ids)
}
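The `catchup` resolver above merges the two result sets and sorts by a negated timestamp to get newest-first order; `std::cmp::Reverse` expresses the same intent without relying on the key being signed. A minimal sketch with made-up timestamps and IDs:

```rust
use std::cmp::Reverse;

// Merge two already-fetched result sets and order them newest-first,
// like the catchup resolver's negated-timestamp sort key.
fn merged_newest_first(
    a: Vec<(i64, &'static str)>,
    b: Vec<(i64, &'static str)>,
) -> Vec<&'static str> {
    let mut all: Vec<_> = a.into_iter().chain(b).collect();
    // Reverse(ts) sorts descending by timestamp.
    all.sort_by_key(|(ts, _)| Reverse(*ts));
    all.into_iter().map(|(_, id)| id).collect()
}

fn main() {
    let newsreader = vec![(100, "n1"), (300, "n2")];
    let notmuch = vec![(200, "m1")];
    // Newest first, interleaved across both sources.
    println!("{:?}", merged_newest_first(newsreader, notmuch));
}
```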
// TODO: this function doesn't get parallelism, possibly because notmuch is sync and blocks, // TODO: this function doesn't get parallelism, possibly because notmuch is sync and blocks,
// rewrite that with tokio::process:Command // rewrite that with tokio::process:Command
#[instrument(skip_all, fields(query=query, request_id=request_id()))] #[instrument(skip_all, fields(query=query, rid=request_id()))]
async fn search<'ctx>( async fn search<'ctx>(
&self, &self,
ctx: &Context<'ctx>, ctx: &Context<'ctx>,
@@ -466,7 +510,7 @@ impl QueryRoot {
.await?) .await?)
} }
#[instrument(skip_all, fields(request_id=request_id()))] #[instrument(skip_all, fields(rid=request_id()))]
async fn tags<'ctx>(&self, ctx: &Context<'ctx>) -> FieldResult<Vec<Tag>> { async fn tags<'ctx>(&self, ctx: &Context<'ctx>) -> FieldResult<Vec<Tag>> {
let nm = ctx.data_unchecked::<Notmuch>(); let nm = ctx.data_unchecked::<Notmuch>();
let pool = ctx.data_unchecked::<PgPool>(); let pool = ctx.data_unchecked::<PgPool>();
@@ -475,11 +519,11 @@ impl QueryRoot {
tags.append(&mut nm::tags(nm, needs_unread)?); tags.append(&mut nm::tags(nm, needs_unread)?);
Ok(tags) Ok(tags)
} }
#[instrument(skip_all, fields(thread_id=thread_id, request_id=request_id()))] #[instrument(skip_all, fields(thread_id=thread_id, rid=request_id()))]
async fn thread<'ctx>(&self, ctx: &Context<'ctx>, thread_id: String) -> Result<Thread, Error> { async fn thread<'ctx>(&self, ctx: &Context<'ctx>, thread_id: String) -> Result<Thread, Error> {
let nm = ctx.data_unchecked::<Notmuch>(); let nm = ctx.data_unchecked::<Notmuch>();
let cacher = ctx.data_unchecked::<FilesystemCacher>();
let pool = ctx.data_unchecked::<PgPool>(); let pool = ctx.data_unchecked::<PgPool>();
let config = ctx.data_unchecked::<Config>();
let debug_content_tree = ctx let debug_content_tree = ctx
.look_ahead() .look_ahead()
.field("messages") .field("messages")
@@ -487,7 +531,7 @@ impl QueryRoot {
.field("contentTree") .field("contentTree")
.exists(); .exists();
if newsreader::is_newsreader_thread(&thread_id) { if newsreader::is_newsreader_thread(&thread_id) {
Ok(newsreader::thread(config, pool, thread_id).await?) Ok(newsreader::thread(cacher, pool, thread_id).await?)
} else { } else {
Ok(nm::thread(nm, pool, thread_id, debug_content_tree).await?) Ok(nm::thread(nm, pool, thread_id, debug_content_tree).await?)
} }
@@ -552,7 +596,7 @@ async fn tantivy_search(
pub struct Mutation; pub struct Mutation;
#[Object] #[Object]
impl Mutation { impl Mutation {
#[instrument(skip_all, fields(query=query, unread=unread, request_id=request_id()))] #[instrument(skip_all, fields(query=query, unread=unread, rid=request_id()))]
async fn set_read_status<'ctx>( async fn set_read_status<'ctx>(
&self, &self,
ctx: &Context<'ctx>, ctx: &Context<'ctx>,
@@ -571,7 +615,7 @@ impl Mutation {
nm::set_read_status(nm, &query, unread).await?; nm::set_read_status(nm, &query, unread).await?;
Ok(true) Ok(true)
} }
#[instrument(skip_all, fields(query=query, tag=tag, request_id=request_id()))] #[instrument(skip_all, fields(query=query, tag=tag, rid=request_id()))]
async fn tag_add<'ctx>( async fn tag_add<'ctx>(
&self, &self,
ctx: &Context<'ctx>, ctx: &Context<'ctx>,
@@ -583,7 +627,7 @@ impl Mutation {
nm.tag_add(&tag, &query)?; nm.tag_add(&tag, &query)?;
Ok(true) Ok(true)
} }
#[instrument(skip_all, fields(query=query, tag=tag, request_id=request_id()))] #[instrument(skip_all, fields(query=query, tag=tag, rid=request_id()))]
async fn tag_remove<'ctx>( async fn tag_remove<'ctx>(
&self, &self,
ctx: &Context<'ctx>, ctx: &Context<'ctx>,
@@ -606,14 +650,16 @@ impl Mutation {
Ok(true) Ok(true)
} }
#[instrument(skip_all, fields(request_id=request_id()))] #[instrument(skip_all, fields(rid=request_id()))]
async fn refresh<'ctx>(&self, ctx: &Context<'ctx>) -> Result<bool, Error> { async fn refresh<'ctx>(&self, ctx: &Context<'ctx>) -> Result<bool, Error> {
let nm = ctx.data_unchecked::<Notmuch>(); let nm = ctx.data_unchecked::<Notmuch>();
let cacher = ctx.data_unchecked::<FilesystemCacher>();
let pool = ctx.data_unchecked::<PgPool>();
info!("{}", String::from_utf8_lossy(&nm.new()?)); info!("{}", String::from_utf8_lossy(&nm.new()?));
newsreader::refresh(pool, cacher).await?;
#[cfg(feature = "tantivy")] #[cfg(feature = "tantivy")]
{ {
let tantivy = ctx.data_unchecked::<TantivyConnection>(); let tantivy = ctx.data_unchecked::<TantivyConnection>();
let pool = ctx.data_unchecked::<PgPool>();
// TODO: parallelize // TODO: parallelize
tantivy.refresh(pool).await?; tantivy.refresh(pool).await?;
} }


@@ -7,22 +7,29 @@ pub mod nm;
#[cfg(feature = "tantivy")] #[cfg(feature = "tantivy")]
pub mod tantivy; pub mod tantivy;
use std::{collections::HashMap, convert::Infallible, fmt, str::FromStr, sync::Arc}; use std::{
collections::{HashMap, HashSet},
convert::Infallible,
fmt,
str::FromStr,
sync::Arc,
};
use async_trait::async_trait; use async_trait::async_trait;
use cacher::{Cacher, FilesystemCacher}; use cacher::{Cacher, FilesystemCacher};
use css_inline::{CSSInliner, InlineError, InlineOptions}; use css_inline::{CSSInliner, InlineError, InlineOptions};
use linkify::{LinkFinder, LinkKind}; use linkify::{LinkFinder, LinkKind};
use log::{error, info, warn}; use log::{debug, error, info, warn};
use lol_html::{ use lol_html::{
element, errors::RewritingError, html_content::ContentType, rewrite_str, text, element, errors::RewritingError, html_content::ContentType, rewrite_str, text,
RewriteStrSettings, RewriteStrSettings,
}; };
use maplit::{hashmap, hashset}; use maplit::{hashmap, hashset};
use regex::Regex;
use reqwest::StatusCode;
use scraper::{Html, Selector}; use scraper::{Html, Selector};
use sqlx::types::time::PrimitiveDateTime; use sqlx::types::time::PrimitiveDateTime;
use thiserror::Error; use thiserror::Error;
use tokio::sync::Mutex;
use url::Url; use url::Url;
use crate::{ use crate::{
@@ -58,6 +65,8 @@ pub enum TransformError {
ReqwestError(#[from] reqwest::Error), ReqwestError(#[from] reqwest::Error),
#[error("failed to parse HTML: {0}")] #[error("failed to parse HTML: {0}")]
HtmlParsingError(String), HtmlParsingError(String),
#[error("got a retryable error code {0} for {1}")]
RetryableHttpStatusError(StatusCode, String),
} }
struct SanitizeHtml<'a> { struct SanitizeHtml<'a> {
@@ -88,70 +97,49 @@ struct StripHtml;
#[async_trait] #[async_trait]
impl Transformer for StripHtml { impl Transformer for StripHtml {
fn should_run(&self, _: &Option<Url>, html: &str) -> bool { fn should_run(&self, link: &Option<Url>, html: &str) -> bool {
debug!("StripHtml should_run {link:?} {}", html.contains("<"));
// Lame test // Lame test
html.contains("<") html.contains("<")
} }
async fn transform(&self, _: &Option<Url>, html: &str) -> Result<String, TransformError> { async fn transform(&self, link: &Option<Url>, html: &str) -> Result<String, TransformError> {
debug!("StripHtml {link:?}");
let mut text = String::new(); let mut text = String::new();
let element_content_handlers = vec![text!("*", |t| { let element_content_handlers = vec![
text += t.as_str(); element!("style", |el| {
Ok(()) el.remove();
})]; Ok(())
let _ = rewrite_str( }),
element!("script", |el| {
el.remove();
Ok(())
}),
];
let html = rewrite_str(
html, html,
RewriteStrSettings { RewriteStrSettings {
element_content_handlers, element_content_handlers,
..RewriteStrSettings::default() ..RewriteStrSettings::default()
}, },
)?; )?;
let element_content_handlers = vec![text!("*", |t| {
text += t.as_str();
Ok(())
})];
let _ = rewrite_str(
&html,
RewriteStrSettings {
element_content_handlers,
..RewriteStrSettings::default()
},
)?;
let re = Regex::new(r"\s+").expect("failed to parse regex");
let text = re.replace_all(&text, " ").to_string();
Ok(text) Ok(text)
} }
} }
struct InlineRemoteStyle<'a> {
base_url: &'a Option<Url>,
}
#[async_trait]
impl<'a> Transformer for InlineRemoteStyle<'a> {
async fn transform(&self, _: &Option<Url>, html: &str) -> Result<String, TransformError> {
let css = concat!(
"/* chrome-default.css */\n",
include_str!("chrome-default.css"),
"\n/* mvp.css */\n",
include_str!("mvp.css"),
"\n/* Xinu Specific overrides */\n",
include_str!("custom.css"),
);
let inline_opts = InlineOptions {
//inline_style_tags: true,
//keep_style_tags: false,
//keep_link_tags: true,
base_url: self.base_url.clone(),
//load_remote_stylesheets: true,
//preallocate_node_capacity: 32,
..InlineOptions::default()
};
//info!("HTML:\n{html}");
info!("base_url: {:#?}", self.base_url);
Ok(
match CSSInliner::options()
.base_url(self.base_url.clone())
.build()
.inline(&html)
{
Ok(inlined_html) => inlined_html,
Err(err) => {
error!("failed to inline remote CSS: {err}");
html.to_string()
}
},
)
}
}
struct InlineStyle; struct InlineStyle;
#[async_trait] #[async_trait]
@@ -160,10 +148,10 @@ impl Transformer for InlineStyle {
let css = concat!( let css = concat!(
"/* chrome-default.css */\n", "/* chrome-default.css */\n",
include_str!("chrome-default.css"), include_str!("chrome-default.css"),
"\n/* mvp.css */\n", //"\n/* mvp.css */\n",
include_str!("mvp.css"), //include_str!("mvp.css"),
"\n/* Xinu Specific overrides */\n", //"\n/* Xinu Specific overrides */\n",
include_str!("custom.css"), //include_str!("custom.css"),
); );
let inline_opts = InlineOptions { let inline_opts = InlineOptions {
inline_style_tags: true, inline_style_tags: true,
@@ -269,13 +257,13 @@ impl Transformer for AddOutlink {
    }
}
-struct SlurpContents {
-    cacher: Arc<Mutex<FilesystemCacher>>,
+struct SlurpContents<'c> {
+    cacher: &'c FilesystemCacher,
    inline_css: bool,
    site_selectors: HashMap<String, Vec<Selector>>,
}
-impl SlurpContents {
+impl<'c> SlurpContents<'c> {
    fn get_selectors(&self, link: &Url) -> Option<&[Selector]> {
        for (host, selector) in self.site_selectors.iter() {
            if link.host_str().map(|h| h.contains(host)).unwrap_or(false) {
@@ -287,36 +275,79 @@ impl SlurpContents {
}
#[async_trait]
-impl Transformer for SlurpContents {
-    fn should_run(&self, link: &Option<Url>, _: &str) -> bool {
+impl<'c> Transformer for SlurpContents<'c> {
+    fn should_run(&self, link: &Option<Url>, html: &str) -> bool {
+        debug!("SlurpContents should_run {link:?}");
+        let mut will_slurp = false;
        if let Some(link) = link {
-            return self.get_selectors(link).is_some();
+            will_slurp = self.get_selectors(link).is_some();
        }
-        false
+        if !will_slurp && self.inline_css {
+            return InlineStyle {}.should_run(link, html);
+        }
+        will_slurp
    }
    async fn transform(&self, link: &Option<Url>, html: &str) -> Result<String, TransformError> {
+        debug!("SlurpContents {link:?}");
+        let retryable_status: HashSet<StatusCode> = vec![
+            StatusCode::UNAUTHORIZED,
+            StatusCode::FORBIDDEN,
+            StatusCode::REQUEST_TIMEOUT,
+            StatusCode::TOO_MANY_REQUESTS,
+        ]
+        .into_iter()
+        .collect();
+        if let Some(test_link) = link {
+            // If SlurpContents is configured for inline CSS, but no
+            // configuration found for this site, use the local InlineStyle
+            // transform.
+            if self.inline_css && self.get_selectors(test_link).is_none() {
+                debug!("local inline CSS for {link:?}");
+                return InlineStyle {}.transform(link, html).await;
+            }
+        }
        let Some(link) = link else {
            return Ok(html.to_string());
        };
        let Some(selectors) = self.get_selectors(&link) else {
            return Ok(html.to_string());
        };
-        let cacher = self.cacher.lock().await;
+        let cacher = self.cacher;
        let body = if let Some(body) = cacher.get(link.as_str()) {
+            info!("cache hit for {link}");
            String::from_utf8_lossy(&body).to_string()
        } else {
-            let body = reqwest::get(link.as_str()).await?.text().await?;
+            let resp = reqwest::get(link.as_str()).await?;
+            let status = resp.status();
+            if status.is_server_error() || retryable_status.contains(&status) {
+                return Err(TransformError::RetryableHttpStatusError(
+                    status,
+                    link.to_string(),
+                ));
+            }
+            if !status.is_success() {
+                return Ok(html.to_string());
+            }
+            let body = resp.text().await?;
            cacher.set(link.as_str(), body.as_bytes());
            body
        };
        let body = Arc::new(body);
        let base_url = Some(link.clone());
        let body = if self.inline_css {
+            debug!("inlining CSS for {link}");
            let inner_body = Arc::clone(&body);
            let res = tokio::task::spawn_blocking(move || {
+                let css = concat!(
+                    "/* chrome-default.css */\n",
+                    include_str!("chrome-default.css"),
+                    "\n/* vars.css */\n",
+                    include_str!("../static/vars.css"),
+                    //"\n/* Xinu Specific overrides */\n",
+                    //include_str!("custom.css"),
+                );
                let res = CSSInliner::options()
                    .base_url(base_url)
+                    .extra_css(Some(std::borrow::Cow::Borrowed(css)))
                    .build()
                    .inline(&inner_body);
@@ -337,6 +368,7 @@ impl Transformer for SlurpContents {
                }
            }
        } else {
+            debug!("using body as-is for {link:?}");
            Arc::into_inner(body).expect("failed to take body out of Arc")
        };
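The hunk above classifies 5xx responses plus a handful of 4xx codes as retryable before caching. A standalone sketch of that classification, using bare `u16` status codes instead of reqwest's `StatusCode` so it runs without the crate (`is_retryable` is an illustrative name, not the crate's API):

```rust
// Sketch of the retry classification added above: any server error, plus the
// specific client errors collected into `retryable_status`, is retried later.
fn is_retryable(status: u16) -> bool {
    // 5xx range, plus 401 UNAUTHORIZED, 403 FORBIDDEN,
    // 408 REQUEST_TIMEOUT, 429 TOO_MANY_REQUESTS.
    matches!(status, 500..=599 | 401 | 403 | 408 | 429)
}

fn main() {
    assert!(is_retryable(503));
    assert!(is_retryable(429));
    // Other non-success codes fall through to returning the original HTML.
    assert!(!is_retryable(404));
    assert!(!is_retryable(200));
}
```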
@@ -650,7 +682,7 @@ fn compute_offset_limit(
    first: Option<i32>,
    last: Option<i32>,
) -> (i32, i32) {
-    let default_page_size = 100;
+    let default_page_size = 10000;
    match (after, before, first, last) {
        // Reasonable defaults
        (None, None, None, None) => (0, default_page_size),
@@ -741,7 +773,19 @@ impl Query {
        for uid in &self.uids {
            parts.push(uid.clone());
        }
-        parts.extend(self.remainder.clone());
+        for r in &self.remainder {
+            // Rewrite "to:" to include ExtraTo:. ExtraTo: is configured in
+            // notmuch-config to index Delivered-To and X-Original-To headers.
+            if r.starts_with("to:") {
+                parts.push("(".to_string());
+                parts.push(r.to_string());
+                parts.push("OR".to_string());
+                parts.push(r.replace("to:", "ExtraTo:"));
+                parts.push(")".to_string());
+            } else {
+                parts.push(r.to_string());
+            }
+        }
        parts.join(" ")
    }
}
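The `to:`/`ExtraTo:` rewrite in the hunk above can be sketched standalone; `expand_to` is an illustrative helper, not the crate's API, and assumes remainder terms arrive as plain strings:

```rust
// Each "to:" term is wrapped in a parenthesized OR with its ExtraTo: twin,
// so notmuch also matches the Delivered-To / X-Original-To index.
fn expand_to(r: &str) -> Vec<String> {
    if r.starts_with("to:") {
        vec![
            "(".to_string(),
            r.to_string(),
            "OR".to_string(),
            r.replace("to:", "ExtraTo:"),
            ")".to_string(),
        ]
    } else {
        vec![r.to_string()]
    }
}

fn main() {
    let parts = expand_to("to:alice@example.com");
    assert_eq!(
        parts.join(" "),
        "( to:alice@example.com OR ExtraTo:alice@example.com )"
    );
    // Non-"to:" terms pass through untouched.
    assert_eq!(expand_to("subject:hi"), vec!["subject:hi".to_string()]);
}
```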
@@ -761,7 +805,17 @@ impl FromStr for Query {
            if word == "is:unread" {
                unread_only = true
            } else if word.starts_with("tag:") {
-                tags.push(word["tag:".len()..].to_string());
+                let t = &word["tag:".len()..];
+                // Per-address emails are faked as `tag:@<domain>/<username>`, rewrite to `to:` form
+                if t.starts_with('@') && t.contains('.') {
+                    let t = match t.split_once('/') {
+                        None => format!("to:{t}"),
+                        Some((domain, user)) => format!("to:{user}{domain}"),
+                    };
+                    remainder.push(t);
+                } else {
+                    tags.push(t.to_string());
+                };
            /*
            } else if word.starts_with("tag:") {
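The per-address tag rewrite above can be sketched on its own; `rewrite_tag` is a hypothetical helper name, and the input is the tag text with the `tag:` prefix already stripped:

```rust
// `tag:@<domain>/<username>` is a faked per-address tag; turn it back into a
// real `to:` query term by flipping the domain/user halves.
fn rewrite_tag(t: &str) -> String {
    match t.split_once('/') {
        None => format!("to:{t}"),
        Some((domain, user)) => format!("to:{user}{domain}"),
    }
}

fn main() {
    assert_eq!(rewrite_tag("@example.com/alice"), "to:alice@example.com");
    // Without a `/`, the whole tag is used as the address.
    assert_eq!(rewrite_tag("@example.com"), "to:@example.com");
}
```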

View File

@@ -1,23 +1,21 @@
-use std::sync::Arc;
+use std::collections::HashMap;
use cacher::FilesystemCacher;
-use log::info;
+use futures::{stream::FuturesUnordered, StreamExt};
+use letterbox_shared::compute_color;
+use log::{error, info};
use maplit::hashmap;
use scraper::Selector;
-use shared::compute_color;
use sqlx::postgres::PgPool;
-use tokio::sync::Mutex;
use tracing::instrument;
use url::Url;
use crate::{
    clean_title, compute_offset_limit,
-    config::Config,
    error::ServerError,
    graphql::{Corpus, NewsPost, Tag, Thread, ThreadSummary},
-    thread_summary_from_row, AddOutlink, EscapeHtml, FrameImages, InlineRemoteStyle, Query,
-    SanitizeHtml, SlurpContents, ThreadSummaryRecord, Transformer, NEWSREADER_TAG_PREFIX,
-    NEWSREADER_THREAD_PREFIX,
+    thread_summary_from_row, AddOutlink, FrameImages, Query, SanitizeHtml, SlurpContents,
+    StripHtml, ThreadSummaryRecord, Transformer, NEWSREADER_TAG_PREFIX, NEWSREADER_THREAD_PREFIX,
};
pub fn is_newsreader_query(query: &Query) -> bool {
@@ -174,7 +172,7 @@ pub async fn tags(pool: &PgPool, _needs_unread: bool) -> Result<Vec<Tag>, Server
#[instrument(name = "newsreader::thread", skip_all, fields(thread_id=%thread_id))]
pub async fn thread(
-    config: &Config,
+    cacher: &FilesystemCacher,
    pool: &PgPool,
    thread_id: String,
) -> Result<Thread, ServerError> {
@@ -191,65 +189,12 @@ pub async fn thread(
    let site = r.name.unwrap_or("NO SITE".to_string());
    // TODO: remove the various places that have this as an Option
    let link = Some(Url::parse(&r.link)?);
-    let mut body = r.summary.unwrap_or("NO SUMMARY".to_string());
-    let cacher = Arc::new(Mutex::new(FilesystemCacher::new(&config.slurp_cache_path)?));
-    let body_tranformers: Vec<Box<dyn Transformer>> = vec![
+    let mut body = r.clean_summary.unwrap_or("NO SUMMARY".to_string());
+    let body_transformers: Vec<Box<dyn Transformer>> = vec![
        Box::new(SlurpContents {
            cacher,
-            // TODO: make this true when bulma is finally removed
-            inline_css: false,
-            site_selectors: hashmap![
-                "atmeta.com".to_string() => vec![
-                    Selector::parse("div.entry-content").unwrap(),
-                ],
-                "blog.prusa3d.com".to_string() => vec![
-                    Selector::parse("article.content .post-block").unwrap(),
-                ],
-                "blog.cloudflare.com".to_string() => vec![
-                    Selector::parse(".author-lists .author-name-tooltip").unwrap(),
-                    Selector::parse(".post-full-content").unwrap()
-                ],
-                "blog.zsa.io".to_string() => vec![
-                    Selector::parse("section.blog-article").unwrap(),
-                ],
-                "engineering.fb.com".to_string() => vec![
-                    Selector::parse("article").unwrap(),
-                ],
-                "grafana.com".to_string() => vec![
-                    Selector::parse(".blog-content").unwrap(),
-                ],
-                "hackaday.com".to_string() => vec![
-                    Selector::parse("div.entry-featured-image").unwrap(),
-                    Selector::parse("div.entry-content").unwrap()
-                ],
-                "ingowald.blog".to_string() => vec![
-                    Selector::parse("article").unwrap(),
-                ],
-                "jvns.ca".to_string() => vec![
-                    Selector::parse("article").unwrap(),
-                ],
-                "mitchellh.com".to_string() => vec![Selector::parse("div.w-full").unwrap()],
-                "natwelch.com".to_string() => vec![
-                    Selector::parse("article div.prose").unwrap(),
-                ],
-                "rustacean-station.org".to_string() => vec![
-                    Selector::parse("article").unwrap(),
-                ],
-                "slashdot.org".to_string() => vec![
-                    Selector::parse("span.story-byline").unwrap(),
-                    Selector::parse("div.p").unwrap(),
-                ],
-                "trofi.github.io".to_string() => vec![
-                    Selector::parse("#content").unwrap(),
-                ],
-                "www.redox-os.org".to_string() => vec![
-                    Selector::parse("div.content").unwrap(),
-                ],
-                "www.smbc-comics.com".to_string() => vec![
-                    Selector::parse("img#cc-comic").unwrap(),
-                    Selector::parse("div#aftercomic img").unwrap(),
-                ],
-            ],
+            inline_css: true,
+            site_selectors: slurp_contents_selectors(),
        }),
        Box::new(FrameImages),
        Box::new(AddOutlink),
@@ -260,7 +205,7 @@ pub async fn thread(
            base_url: &link,
        }),
    ];
-    for t in body_tranformers.iter() {
+    for t in body_transformers.iter() {
        if t.should_run(&link, &body) {
            body = t.transform(&link, &body).await?;
        }
@@ -308,3 +253,132 @@ pub async fn set_read_status<'ctx>(
    }
    Ok(true)
}
#[instrument(name = "newsreader::refresh", skip_all)]
pub async fn refresh<'ctx>(pool: &PgPool, cacher: &FilesystemCacher) -> Result<bool, ServerError> {
async fn update_search_summary(
pool: &PgPool,
cacher: &FilesystemCacher,
link: String,
body: String,
id: i32,
) -> Result<(), ServerError> {
let slurp_contents = SlurpContents {
cacher,
inline_css: true,
site_selectors: slurp_contents_selectors(),
};
let strip_html = StripHtml;
info!("adding {link} to search index");
let mut body = body;
if let Ok(link) = Url::parse(&link) {
let link = Some(link);
if slurp_contents.should_run(&link, &body) {
body = slurp_contents.transform(&link, &body).await?;
}
} else {
error!("failed to parse link: {}", link);
}
body = strip_html.transform(&None, &body).await?;
sqlx::query!(
"UPDATE post SET search_summary = $1 WHERE id = $2",
body,
id
)
.execute(pool)
.await?;
Ok(())
}
let mut unordered: FuturesUnordered<_> = sqlx::query_file!("sql/need-search-summary.sql",)
.fetch_all(pool)
.await?
.into_iter()
.filter_map(|r| {
let Some(body) = r.clean_summary else {
error!("clean_summary missing for {}", r.link);
return None;
};
let id = r.id;
Some(update_search_summary(pool, cacher, r.link, body, id))
})
.collect();
while let Some(res) = unordered.next().await {
//let res = res;
match res {
Ok(()) => {}
Err(err) => {
info!("failed refresh {err:?}");
// TODO:
//fd.error = Some(err);
}
};
}
Ok(true)
}
fn slurp_contents_selectors() -> HashMap<String, Vec<Selector>> {
hashmap![
"atmeta.com".to_string() => vec![
Selector::parse("div.entry-content").unwrap(),
],
"blog.prusa3d.com".to_string() => vec![
Selector::parse("article.content .post-block").unwrap(),
],
"blog.cloudflare.com".to_string() => vec![
Selector::parse(".author-lists .author-name-tooltip").unwrap(),
Selector::parse(".post-full-content").unwrap()
],
"blog.zsa.io".to_string() => vec![
Selector::parse("section.blog-article").unwrap(),
],
"engineering.fb.com".to_string() => vec![
Selector::parse("article").unwrap(),
],
"grafana.com".to_string() => vec![
Selector::parse(".blog-content").unwrap(),
],
"hackaday.com".to_string() => vec![
Selector::parse("div.entry-featured-image").unwrap(),
Selector::parse("div.entry-content").unwrap()
],
"ingowald.blog".to_string() => vec![
Selector::parse("article").unwrap(),
],
"jvns.ca".to_string() => vec![
Selector::parse("article").unwrap(),
],
"mitchellh.com".to_string() => vec![Selector::parse("div.w-full").unwrap()],
"natwelch.com".to_string() => vec![
Selector::parse("article div.prose").unwrap(),
],
"rustacean-station.org".to_string() => vec![
Selector::parse("article").unwrap(),
],
"slashdot.org".to_string() => vec![
Selector::parse("span.story-byline").unwrap(),
Selector::parse("div.p").unwrap(),
],
"theonion.com".to_string() => vec![
// Single image joke w/ title
Selector::parse("article > section > div > figure").unwrap(),
// Single cartoon
Selector::parse("article > div > div > figure").unwrap(),
// Image at top of article
Selector::parse("article > header > div > div > figure").unwrap(),
// Article body
Selector::parse("article .entry-content > *").unwrap(),
],
"trofi.github.io".to_string() => vec![
Selector::parse("#content").unwrap(),
],
"www.redox-os.org".to_string() => vec![
Selector::parse("div.content").unwrap(),
],
"www.smbc-comics.com".to_string() => vec![
Selector::parse("img#cc-comic").unwrap(),
Selector::parse("div#aftercomic img").unwrap(),
],
]
}
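The selector map above is keyed by host fragments, and `get_selectors` matches them with `link.host_str().map(|h| h.contains(host))`. A standalone sketch of that substring host match, using plain strings rather than `url::Url` so it runs on its own (`host_matches` is an illustrative name):

```rust
// A selector entry applies when its host key is contained anywhere in the
// link's host; a link with no host never matches.
fn host_matches(link_host: Option<&str>, key: &str) -> bool {
    link_host.map(|h| h.contains(key)).unwrap_or(false)
}

fn main() {
    assert!(host_matches(Some("blog.cloudflare.com"), "blog.cloudflare.com"));
    // Substring matching also covers extra subdomain prefixes.
    assert!(host_matches(Some("www.hackaday.com"), "hackaday.com"));
    assert!(!host_matches(None, "jvns.ca"));
    assert!(!host_matches(Some("example.com"), "jvns.ca"));
}
```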

View File

@@ -1,14 +1,10 @@
-use std::{
-    collections::HashMap,
-    fs::File,
-    hash::{DefaultHasher, Hash, Hasher},
-    time::Instant,
-};
+use std::{collections::HashMap, fs::File};
+use letterbox_notmuch::Notmuch;
+use letterbox_shared::compute_color;
use log::{error, info, warn};
use mailparse::{parse_content_type, parse_mail, MailHeader, MailHeaderMap, ParsedMail};
use memmap::MmapOptions;
-use notmuch::Notmuch;
use sqlx::PgPool;
use tracing::instrument;
@@ -43,7 +39,9 @@ pub fn is_notmuch_thread_or_id(id: &str) -> bool {
}
// TODO(wathiede): decide good error type
-pub fn threadset_to_messages(thread_set: notmuch::ThreadSet) -> Result<Vec<Message>, ServerError> {
+pub fn threadset_to_messages(
+    thread_set: letterbox_notmuch::ThreadSet,
+) -> Result<Vec<Message>, ServerError> {
    for t in thread_set.0 {
        for _tn in t.0 {}
    }
@@ -105,7 +103,6 @@ pub async fn search(
#[instrument(name="nm::tags", skip_all, fields(needs_unread=needs_unread))]
pub fn tags(nm: &Notmuch, needs_unread: bool) -> Result<Vec<Tag>, ServerError> {
-    let now = Instant::now();
    let unread_msg_cnt: HashMap<String, usize> = if needs_unread {
        // 10000 is an arbitrary number, if there's more than 10k unread messages, we'll
        // get an inaccurate count.
@@ -121,13 +118,11 @@ pub fn tags(nm: &Notmuch, needs_unread: bool) -> Result<Vec<Tag>, ServerError> {
    } else {
        HashMap::new()
    };
-    let tags = nm
+    let tags: Vec<_> = nm
        .tags()?
        .into_iter()
        .map(|tag| {
-            let mut hasher = DefaultHasher::new();
-            tag.hash(&mut hasher);
-            let hex = format!("#{:06x}", hasher.finish() % (1 << 24));
+            let hex = compute_color(&tag);
            let unread = if needs_unread {
                *unread_msg_cnt.get(&tag).unwrap_or(&0)
            } else {
@@ -140,8 +135,24 @@ pub fn tags(nm: &Notmuch, needs_unread: bool) -> Result<Vec<Tag>, ServerError> {
                unread,
            }
        })
+        .chain(
+            nm.unread_recipients()?
+                .into_iter()
+                .filter_map(|(name, unread)| {
+                    let Some(idx) = name.find('@') else {
+                        return None;
+                    };
+                    let name = format!("{}/{}", &name[idx..], &name[..idx]);
+                    let bg_color = compute_color(&name);
+                    Some(Tag {
+                        name,
+                        fg_color: "white".to_string(),
+                        bg_color,
+                        unread,
+                    })
+                }),
+        )
        .collect();
-    info!("Fetching tags took {} seconds", now.elapsed().as_secs_f32());
    Ok(tags)
}
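The `.chain(...)` above turns each unread recipient address into a fake tag name by flipping the halves around the `@`. A minimal sketch of that renaming, with `recipient_tag_name` as a hypothetical helper name:

```rust
// `user@domain` becomes `@domain/user`, so per-address tags sort and group
// by domain; names without an `@` are skipped entirely.
fn recipient_tag_name(name: &str) -> Option<String> {
    let idx = name.find('@')?;
    Some(format!("{}/{}", &name[idx..], &name[..idx]))
}

fn main() {
    assert_eq!(
        recipient_tag_name("alice@example.com").as_deref(),
        Some("@example.com/alice")
    );
    assert_eq!(recipient_tag_name("no-at-sign"), None);
}
```

This is the inverse of the `tag:@<domain>/<username>` rewrite the query parser performs when such a tag is searched.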
@@ -185,12 +196,14 @@ pub async fn thread(
    let to = email_addresses(&path, &m, "to")?;
    let cc = email_addresses(&path, &m, "cc")?;
+    let delivered_to = email_addresses(&path, &m, "delivered-to")?.pop();
+    let x_original_to = email_addresses(&path, &m, "x-original-to")?.pop();
    let subject = m.headers.get_first_value("subject");
    let timestamp = m
        .headers
        .get_first_value("date")
        .and_then(|d| mailparse::dateparse(&d).ok());
-    let cid_prefix = shared::urls::cid_prefix(None, &id);
+    let cid_prefix = letterbox_shared::urls::cid_prefix(None, &id);
    let base_url = None;
    let mut part_addr = Vec::new();
    part_addr.push(id.to_string());
@@ -223,7 +236,7 @@ pub async fn thread(
    }
    format!(
-        r#"<p class="view-part-text-plain">{}</p>"#,
+        r#"<p class="view-part-text-plain font-mono whitespace-pre-line">{}</p>"#,
        // Trim newlines to prevent excessive white space at the beginning/end of
        // presenation. Leave tabs and spaces incase plain text attempts to center a
        // header on the first line.
@@ -304,6 +317,8 @@ pub async fn thread(
            body,
            path,
            attachments,
+            delivered_to,
+            x_original_to,
        });
    }
    messages.reverse();
@@ -578,7 +593,7 @@ fn flatten_body_parts(parts: &[Body]) -> Body {
        .map(|p| match p {
            Body::PlainText(PlainText { text, .. }) => {
                format!(
-                    r#"<p class="view-part-text-plain">{}</p>"#,
+                    r#"<p class="view-part-text-plain font-mono whitespace-pre-line">{}</p>"#,
                    // Trim newlines to prevent excessive white space at the beginning/end of
                    // presenation. Leave tabs and spaces incase plain text attempts to center a
                    // header on the first line.
@@ -696,7 +711,6 @@ fn walk_attachments_inner<T, F: Fn(&ParsedMail, &[usize]) -> Option<T> + Copy>(
fn extract_attachments(m: &ParsedMail, id: &str) -> Result<Vec<Attachment>, ServerError> {
    let mut attachments = Vec::new();
    for (idx, sp) in m.subparts.iter().enumerate() {
-        info!("sp: {:?}", sp.headers);
        if let Some(attachment) = extract_attachment(sp, id, &[idx]) {
            // Filter out inline attachements, they're flattened into the body of the message.
            if attachment.disposition == DispositionType::Attachment {

server/static/vars.css Normal file (+42)
View File

@@ -0,0 +1,42 @@
:root {
--active-brightness: 0.85;
--border-radius: 5px;
--box-shadow: 2px 2px 10px;
--color-accent: #118bee15;
--color-bg: #fff;
--color-bg-secondary: #e9e9e9;
--color-link: #118bee;
--color-secondary: #920de9;
--color-secondary-accent: #920de90b;
--color-shadow: #f4f4f4;
--color-table: #118bee;
--color-text: #000;
--color-text-secondary: #999;
--color-scrollbar: #cacae8;
--font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, Oxygen-Sans, Ubuntu, Cantarell, "Helvetica Neue", sans-serif;
--hover-brightness: 1.2;
--justify-important: center;
--justify-normal: left;
--line-height: 1.5;
/*
--width-card: 285px;
--width-card-medium: 460px;
--width-card-wide: 800px;
*/
--width-content: 1080px;
}
@media (prefers-color-scheme: dark) {
:root[color-mode="user"] {
--color-accent: #0097fc4f;
--color-bg: #333;
--color-bg-secondary: #555;
--color-link: #0097fc;
--color-secondary: #e20de9;
--color-secondary-accent: #e20de94f;
--color-shadow: #bbbbbb20;
--color-table: #0097fc;
--color-text: #f7f7f7;
--color-text-secondary: #aaa;
}
}

View File

@@ -1,11 +1,16 @@
[package]
-name = "shared"
-version = "0.0.115"
-edition = "2021"
+name = "letterbox-shared"
+description = "Shared module for letterbox"
+authors.workspace = true
+edition.workspace = true
+license.workspace = true
+publish.workspace = true
+repository.workspace = true
+version.workspace = true
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
-build-info = "0.0.38"
-notmuch = { path = "../notmuch" }
+build-info = "0.0.40"
+letterbox-notmuch = { version = "0.10.0", path = "../notmuch", registry = "xinu" }
serde = { version = "1.0.147", features = ["derive"] }

View File

@@ -1,7 +1,7 @@
use std::hash::{DefaultHasher, Hash, Hasher};
use build_info::{BuildInfo, VersionControl};
-use notmuch::SearchSummary;
+use letterbox_notmuch::SearchSummary;
use serde::{Deserialize, Serialize};
#[derive(Serialize, Deserialize, Debug)]

View File

@@ -1,16 +1,15 @@
[package]
-version = "0.0.115"
-name = "letterbox"
-repository = "https://github.com/seed-rs/seed-quickstart"
-authors = ["Bill Thiede <git@xinu.tv>"]
-description = "App Description"
-categories = ["category"]
-license = "MIT"
-readme = "./README.md"
-edition = "2021"
+name = "letterbox-web"
+description = "Web frontend for letterbox"
+authors.workspace = true
+edition.workspace = true
+license.workspace = true
+publish.workspace = true
+repository.workspace = true
+version.workspace = true
[build-dependencies]
-build-info-build = "0.0.38"
+build-info-build = "0.0.40"
[dev-dependencies]
wasm-bindgen-test = "0.3.33"
@@ -22,18 +21,22 @@ seed = { version = "0.10.0", features = ["routing"] }
#seed = "0.9.2"
console_log = { version = "0.1.0", registry = "xinu" }
serde = { version = "1.0.147", features = ["derive"] }
-notmuch = { path = "../notmuch" }
-shared = { path = "../shared" }
-itertools = "0.10.5"
+itertools = "0.14.0"
serde_json = { version = "1.0.93", features = ["unbounded_depth"] }
chrono = "0.4.31"
-graphql_client = "0.13.0"
-thiserror = "1.0.50"
-seed_hooks = { git = "https://github.com/wathiede/styles_hooks", package = "seed_hooks", branch = "main" }
-gloo-net = { version = "0.4.0", features = ["json", "serde_json"] }
+graphql_client = "0.14.0"
+thiserror = "2.0.0"
+gloo-net = { version = "0.6.0", features = ["json", "serde_json"] }
human_format = "1.1.0"
-build-info = "0.0.38"
-wasm-bindgen = "0.2.95"
+build-info = "0.0.40"
+wasm-bindgen = "=0.2.100"
+uuid = { version = "1.13.1", features = [
+    "js",
+] } # direct dep to set js feature, prevents Rng issues
+letterbox-shared = { version = "0.10.0", path = "../shared", registry = "xinu" }
+letterbox-notmuch = { version = "0.10.0", path = "../notmuch", registry = "xinu" }
+seed_hooks = { version = "0.4.0", registry = "xinu" }
+strum_macros = "0.27.1"
[package.metadata.wasm-pack.profile.release]
wasm-opt = ['-Os']
@@ -47,4 +50,6 @@ features = [
    "MediaQueryList",
    "Navigator",
    "Window",
+    "History",
+    "ScrollRestoration",
]

View File

@@ -0,0 +1,3 @@
query CatchupQuery($query: String!) {
catchup(query: $query)
}

View File

@@ -671,6 +671,30 @@
        }
      }
    },
{
"args": [],
"deprecationReason": null,
"description": null,
"isDeprecated": false,
"name": "xOriginalTo",
"type": {
"kind": "OBJECT",
"name": "Email",
"ofType": null
}
},
{
"args": [],
"deprecationReason": null,
"description": null,
"isDeprecated": false,
"name": "deliveredTo",
"type": {
"kind": "OBJECT",
"name": "Email",
"ofType": null
}
},
    {
      "args": [],
      "deprecationReason": null,
@@ -1268,6 +1292,45 @@
        }
      }
    },
{
"args": [
{
"defaultValue": null,
"description": null,
"name": "query",
"type": {
"kind": "NON_NULL",
"name": null,
"ofType": {
"kind": "SCALAR",
"name": "String",
"ofType": null
}
}
}
],
"deprecationReason": null,
"description": null,
"isDeprecated": false,
"name": "catchup",
"type": {
"kind": "NON_NULL",
"name": null,
"ofType": {
"kind": "LIST",
"name": null,
"ofType": {
"kind": "NON_NULL",
"name": null,
"ofType": {
"kind": "SCALAR",
"name": "String",
"ofType": null
}
}
}
}
},
    {
      "args": [
        {

View File

@@ -31,6 +31,14 @@ query ShowThreadQuery($threadId: String!) {
      name
      addr
    }
xOriginalTo {
name
addr
}
deliveredTo {
name
addr
}
    timestamp
    body {
      __typename

View File

@@ -4,8 +4,8 @@
<head>
    <meta charset="utf-8">
    <meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
-    <link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/font-awesome/6.3.0/css/all.min.css"
-        integrity="sha512-SzlrxWUlpfuzQ+pcUCosxcglQRNAq/DZjVsC0lE40xsADsfeQoEypE+enwcOiGjk/bSuGGKHEyjSoQ1zVisanQ=="
+    <link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/font-awesome/6.7.2/css/all.min.css"
+        integrity="sha512-Evv84Mr4kqVGRNSgIGL/F/aIDqQb7xQ2vcrdIwxfjThSH8CSR7PBEakCr51Ck+w+/U6swU2Im1vVX0SVk9ABhg=="
        crossorigin="anonymous" referrerpolicy="no-referrer" />
    <link rel="icon" href="https://static.xinu.tv/favicon/letterbox.svg" />
    <!-- tall thin font for user icon -->
@@ -13,7 +13,9 @@
    <link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
    <link href="https://fonts.googleapis.com/css2?family=Poppins:wght@700&display=swap" rel="stylesheet">
    <!-- <link data-trunk rel="css" href="static/site-specific.css" /> -->
+    <link data-trunk rel="css" href="static/vars.css" />
    <link data-trunk rel="tailwind-css" href="./src/tailwind.css" />
+    <link data-trunk rel="css" href="static/overrides.css" />
</head>
<body>

View File

@@ -12,6 +12,14 @@ use serde::{de::DeserializeOwned, Serialize};
)]
pub struct FrontPageQuery;
+#[derive(GraphQLQuery)]
+#[graphql(
+    schema_path = "graphql/schema.json",
+    query_path = "graphql/catchup.graphql",
+    response_derives = "Debug"
+)]
+pub struct CatchupQuery;
#[derive(GraphQLQuery)]
#[graphql(
    schema_path = "graphql/schema.json",

View File

@@ -18,6 +18,9 @@ fn main() {
    #[cfg(debug_assertions)]
    console_error_panic_hook::set_once();
+    #[cfg(debug_assertions)]
+    let lvl = Level::Debug;
+    #[cfg(not(debug_assertions))]
    let lvl = Level::Info;
    console_log::init_with_level(lvl).expect("failed to initialize console logging");
    // Mount the `app` to the element with the `id` "app".

View File

@@ -27,18 +27,23 @@ pub fn unread_query() -> &'static str {
// `init` describes what should happen when your app started.
pub fn init(url: Url, orders: &mut impl Orders<Msg>) -> Model {
-    let version = shared::build_version(bi);
+    let version = letterbox_shared::build_version(bi);
    info!("Build Info: {}", version);
+    // Disable restoring to scroll position when navigating
+    window()
+        .history()
+        .expect("couldn't get history")
+        .set_scroll_restoration(web_sys::ScrollRestoration::Manual)
+        .expect("failed to set scroll restoration to manual");
    if url.hash().is_none() {
        orders.request_url(urls::search(unread_query(), 0));
    } else {
-        orders.notify(subs::UrlRequested::new(url));
+        orders.request_url(url);
    };
-    orders.stream(streams::window_event(Ev::Resize, |_| Msg::OnResize));
    // TODO(wathiede): only do this while viewing the index? Or maybe add a new message that force
    // 'notmuch new' on the server periodically?
    orders.stream(streams::interval(30_000, || Msg::RefreshStart));
-    orders.subscribe(on_url_changed);
+    orders.subscribe(Msg::OnUrlChanged);
    orders.stream(streams::window_event(Ev::Scroll, |_| Msg::WindowScrolled));
    build_info::build_info!(fn bi);
@@ -53,18 +58,23 @@ pub fn init(url: Url, orders: &mut impl Orders<Msg>) -> Model {
            client: version,
            server: None,
        },
+        catchup: None,
+        last_url: Url::current(),
    }
}
-fn on_url_changed(uc: subs::UrlChanged) -> Msg {
-    let mut url = uc.0;
+fn on_url_changed(old: &Url, mut new: Url) -> Msg {
+    let did_change = *old != new;
+    let mut messages = Vec::new();
+    if did_change {
+        messages.push(Msg::ScrollToTop)
+    }
    info!(
-        "url changed '{}', history {}",
-        url,
+        "url changed\nold '{old}'\nnew '{new}', history {}",
        history().length().unwrap_or(0)
    );
-    let hpp = url.remaining_hash_path_parts();
-    match hpp.as_slice() {
+    let hpp = new.remaining_hash_path_parts();
+    let msg = match hpp.as_slice() {
        ["t", tid] => Msg::ShowThreadRequest {
            thread_id: tid.to_string(),
        },
@@ -101,11 +111,14 @@ fn on_url_changed(uc: subs::UrlChanged) -> Msg {
                last: None,
            }
        }
-    }
+    };
+    messages.push(msg);
+    Msg::MultiMsg(messages)
}
// `update` describes how to handle each `Msg`.
pub fn update(msg: Msg, model: &mut Model, orders: &mut impl Orders<Msg>) {
+    debug!("update({})", msg);
    match msg {
        Msg::Noop => {}
        Msg::RefreshStart => {
@@ -131,7 +144,7 @@ pub fn update(msg: Msg, model: &mut Model, orders: &mut impl Orders<Msg>) {
orders.perform_cmd(async move { Msg::Refresh }); orders.perform_cmd(async move { Msg::Refresh });
} }
Msg::Refresh => { Msg::Refresh => {
orders.perform_cmd(async move { on_url_changed(subs::UrlChanged(Url::current())) }); orders.request_url(Url::current());
} }
Msg::Reload => { Msg::Reload => {
window() window()
@@ -139,7 +152,10 @@ pub fn update(msg: Msg, model: &mut Model, orders: &mut impl Orders<Msg>) {
.reload() .reload()
.expect("failed to reload window"); .expect("failed to reload window");
} }
Msg::OnResize => (), Msg::OnUrlChanged(new_url) => {
orders.send_msg(on_url_changed(&model.last_url, new_url.0.clone()));
model.last_url = new_url.0;
}
Msg::NextPage => { Msg::NextPage => {
match &model.context { match &model.context {
@@ -181,10 +197,7 @@ pub fn update(msg: Msg, model: &mut Model, orders: &mut impl Orders<Msg>) {
}; };
} }
Msg::GoToSearchResults => { Msg::GoToSearchResults => {
let url = urls::search(&model.query, 0); orders.send_msg(Msg::SearchQuery(model.query.clone()));
info!("GoToSearchRestuls Start");
orders.request_url(url);
info!("GoToSearchRestuls End");
} }
Msg::UpdateQuery(query) => model.query = query, Msg::UpdateQuery(query) => model.query = query,
@@ -279,12 +292,16 @@ pub fn update(msg: Msg, model: &mut Model, orders: &mut impl Orders<Msg>) {
                 )
             });
         }
-        Msg::FrontPageResult(Err(e)) => error!("error FrontPageResult: {e:?}"),
+        Msg::FrontPageResult(Err(e)) => {
+            orders.send_msg(Msg::Reload);
+            error!("error FrontPageResult: {e:?}");
+        }
         Msg::FrontPageResult(Ok(graphql_client::Response {
             data: None,
             errors: None,
             ..
         })) => {
+            orders.send_msg(Msg::Reload);
             error!("FrontPageResult no data or errors, should not happen");
         }
         Msg::FrontPageResult(Ok(graphql_client::Response {
@@ -292,6 +309,7 @@ pub fn update(msg: Msg, model: &mut Model, orders: &mut impl Orders<Msg>) {
             errors: Some(e),
             ..
         })) => {
+            orders.send_msg(Msg::Reload);
             error!("FrontPageResult error: {e:?}");
         }
         Msg::FrontPageResult(Ok(graphql_client::Response {
@@ -307,7 +325,6 @@ pub fn update(msg: Msg, model: &mut Model, orders: &mut impl Orders<Msg>) {
                 })
                 .collect(),
             );
-            info!("pager {:#?}", data.search.page_info);
            let selected_threads = 'context: {
                if let Context::SearchResult {
                    results,
@@ -388,8 +405,42 @@ pub fn update(msg: Msg, model: &mut Model, orders: &mut impl Orders<Msg>) {
             orders.send_msg(Msg::WindowScrolled);
         }
         Msg::ShowThreadResult(bad) => {
+            orders.send_msg(Msg::Reload);
             error!("show_thread_query error: {bad:#?}");
         }
+        Msg::CatchupRequest { query } => {
+            orders.perform_cmd(async move {
+                Msg::CatchupResult(
+                    send_graphql::<_, graphql::catchup_query::ResponseData>(
+                        graphql::CatchupQuery::build_query(graphql::catchup_query::Variables {
+                            query,
+                        }),
+                    )
+                    .await,
+                )
+            });
+        }
+        Msg::CatchupResult(Ok(graphql_client::Response {
+            data: Some(data), ..
+        })) => {
+            let items = data.catchup;
+            if items.is_empty() {
+                orders.send_msg(Msg::GoToSearchResults);
+                model.catchup = None;
+            } else {
+                orders.request_url(urls::thread(&items[0]));
+                model.catchup = Some(Catchup {
+                    items: items
+                        .into_iter()
+                        .map(|id| CatchupItem { id, seen: false })
+                        .collect(),
+                });
+            }
+        }
+        Msg::CatchupResult(bad) => {
+            orders.send_msg(Msg::Reload);
+            error!("catchup_query error: {bad:#?}");
+        }
         Msg::SelectionSetNone => {
             if let Context::SearchResult {
                 selected_threads, ..
@@ -496,18 +547,19 @@ pub fn update(msg: Msg, model: &mut Model, orders: &mut impl Orders<Msg>) {
             }
         }
         Msg::MultiMsg(msgs) => msgs.into_iter().for_each(|msg| update(msg, model, orders)),
         Msg::CopyToClipboard(text) => {
-            let clipboard = seed::window()
-                .navigator()
-                .clipboard()
-                .expect("couldn't get clipboard");
+            let clipboard = seed::window().navigator().clipboard();
             orders.perform_cmd(async move {
                 wasm_bindgen_futures::JsFuture::from(clipboard.write_text(&text))
                     .await
                     .expect("failed to copy to clipboard");
             });
         }
+        Msg::ScrollToTop => {
+            info!("scrolling to the top");
+            web_sys::window().unwrap().scroll_to_with_x_and_y(0., 0.);
+        }
         Msg::WindowScrolled => {
-            info!("WindowScrolled");
+            // TODO: model.content_el doesn't go to None like it should when the DOM is recreated and the referenced element goes away
             if let Some(el) = model.content_el.get() {
                 let ih = window()
                     .inner_height()
@@ -516,7 +568,6 @@ pub fn update(msg: Msg, model: &mut Model, orders: &mut impl Orders<Msg>) {
                     .value_of();
                 let r = el.get_bounding_client_rect();
-                info!("r {r:?} ih {ih}");
                 if r.height() < ih {
                     // The whole content fits in the window, no scrollbar
                     orders.send_msg(Msg::SetProgress(0.));
@@ -553,12 +604,82 @@ pub fn update(msg: Msg, model: &mut Model, orders: &mut impl Orders<Msg>) {
                     "Server ({}) and client ({}) version mismatch, reloading",
                     version, model.versions.client
                 );
+                #[cfg(not(debug_assertions))]
                 orders.send_msg(Msg::Reload);
             }
             model.versions.server = Some(version);
         }
+        Msg::CatchupStart => {
+            let query = if model.query.contains("is:unread") {
+                model.query.to_string()
+            } else {
+                format!("{} is:unread", model.query)
+            };
+            info!("starting catchup mode w/ {}", query);
+            orders.send_msg(Msg::ScrollToTop);
+            orders.send_msg(Msg::CatchupRequest { query });
+        }
+        Msg::CatchupKeepUnread => {
+            if let Some(thread_id) = current_thread_id(&model.context) {
+                orders.send_msg(Msg::SetUnread(thread_id, true));
+            };
+            orders.send_msg(Msg::CatchupNext);
+        }
+        Msg::CatchupMarkAsRead => {
+            if let Some(thread_id) = current_thread_id(&model.context) {
+                orders.send_msg(Msg::SetUnread(thread_id, false));
+            };
+            orders.send_msg(Msg::CatchupNext);
+        }
+        Msg::CatchupNext => {
+            orders.send_msg(Msg::ScrollToTop);
+            let Some(catchup) = &mut model.catchup else {
+                orders.send_msg(Msg::GoToSearchResults);
+                return;
+            };
+            let Some(idx) = catchup.items.iter().position(|i| !i.seen) else {
+                // All items have been seen
+                orders.send_msg(Msg::CatchupExit);
+                orders.send_msg(Msg::GoToSearchResults);
+                return;
+            };
+            catchup.items[idx].seen = true;
+            if idx < catchup.items.len() - 1 {
+                orders.request_url(urls::thread(&catchup.items[idx + 1].id));
+                return;
+            } else {
+                // Reached the last item
+                orders.send_msg(Msg::CatchupExit);
+                orders.send_msg(Msg::GoToSearchResults);
+                return;
+            };
+        }
+        Msg::CatchupExit => {
+            orders.send_msg(Msg::ScrollToTop);
+            model.catchup = None;
+        }
     }
 }
+fn current_thread_id(context: &Context) -> Option<String> {
+    match context {
+        Context::ThreadResult {
+            thread:
+                ShowThreadQueryThread::EmailThread(ShowThreadQueryThreadOnEmailThread {
+                    thread_id, ..
+                }),
+            ..
+        } => Some(thread_id.clone()),
+        Context::ThreadResult {
+            thread:
+                ShowThreadQueryThread::NewsPost(ShowThreadQueryThreadOnNewsPost { thread_id, .. }),
+            ..
+        } => Some(thread_id.clone()),
+        _ => None,
+    }
+}
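`CatchupStart` above widens the active query with `is:unread` unless that term is already present (a naive substring check). As a standalone sketch of just that composition step:

```rust
/// Append the `is:unread` term to a notmuch-style query unless it is
/// already there. Mirrors the substring check in `Msg::CatchupStart`.
fn catchup_query(query: &str) -> String {
    if query.contains("is:unread") {
        query.to_string()
    } else {
        format!("{} is:unread", query)
    }
}

fn main() {
    assert_eq!(catchup_query("tag:inbox"), "tag:inbox is:unread");
    // Already restricted to unread: left unchanged.
    assert_eq!(catchup_query("tag:inbox is:unread"), "tag:inbox is:unread");
}
```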
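The catchup flow walks a queue of thread ids, marking the item currently on screen as seen before deciding whether to show the next one or exit. The advance step can be modelled in isolation (the `Advance` enum and `advance` helper are illustrative names, not the app's API):

```rust
#[derive(Debug)]
struct CatchupItem {
    id: String,
    seen: bool,
}

/// What the UI should do after "next" is pressed in catchup mode.
#[derive(Debug, PartialEq)]
enum Advance {
    Show(String), // navigate to this thread id
    Exit,         // queue exhausted, leave catchup mode
}

fn advance(items: &mut [CatchupItem]) -> Advance {
    // The first unseen item is the one currently on screen.
    let Some(idx) = items.iter().position(|i| !i.seen) else {
        return Advance::Exit;
    };
    items[idx].seen = true;
    if idx + 1 < items.len() {
        Advance::Show(items[idx + 1].id.clone())
    } else {
        Advance::Exit
    }
}

fn main() {
    let mut q: Vec<CatchupItem> = ["a", "b"]
        .iter()
        .map(|id| CatchupItem { id: id.to_string(), seen: false })
        .collect();
    assert_eq!(advance(&mut q), Advance::Show("b".to_string())); // "a" seen, show "b"
    assert_eq!(advance(&mut q), Advance::Exit); // "b" was the last item
    assert_eq!(advance(&mut q), Advance::Exit); // queue stays exhausted
}
```

Marking the item seen before navigating means a mid-session exit never revisits a thread already shown.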
 // `Model` describes our app state.
 pub struct Model {
     pub query: String,
@@ -568,6 +689,8 @@ pub struct Model {
     pub read_completion_ratio: f64,
     pub content_el: ElRef<HtmlElement>,
     pub versions: Version,
+    pub catchup: Option<Catchup>,
+    pub last_url: Url,
 }
 #[derive(Debug)]
@@ -604,6 +727,15 @@ pub enum Context {
     },
 }
+pub struct Catchup {
+    pub items: Vec<CatchupItem>,
+}
+pub struct CatchupItem {
+    pub id: String,
+    pub seen: bool,
+}
 pub struct Tag {
     pub name: String,
     pub bg_color: String,
@@ -617,14 +749,15 @@ pub enum RefreshingState {
     Error(String),
 }
 // `Msg` describes the different events you can modify state with.
+#[derive(strum_macros::Display)]
 pub enum Msg {
     Noop,
     // Tell the client to refresh its state
     Refresh,
     // Tell the client to reload whole page from server
     Reload,
-    // Window has changed size
-    OnResize,
+    // TODO: add GoToUrl
+    OnUrlChanged(subs::UrlChanged),
     // Tell the server to update state
     RefreshStart,
     RefreshDone(Option<gloo_net::Error>),
@@ -654,12 +787,17 @@ pub enum Msg {
     ShowThreadResult(
         Result<graphql_client::Response<graphql::show_thread_query::ResponseData>, gloo_net::Error>,
     ),
+    CatchupRequest {
+        query: String,
+    },
+    CatchupResult(
+        Result<graphql_client::Response<graphql::catchup_query::ResponseData>, gloo_net::Error>,
+    ),
     SelectionSetNone,
     SelectionSetAll,
     SelectionAddTag(String),
     #[allow(dead_code)]
+    // TODO
     SelectionRemoveTag(String),
     SelectionMarkAsRead,
     SelectionMarkAsUnread,
@@ -672,7 +810,14 @@ pub enum Msg {
     CopyToClipboard(String),
+    ScrollToTop,
     WindowScrolled,
     SetProgress(f64),
     UpdateServerVersion(String),
+    CatchupStart,
+    CatchupKeepUnread,
+    CatchupMarkAsRead,
+    CatchupNext,
+    CatchupExit,
 }
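`UpdateServerVersion` now gates the forced reload behind `#[cfg(not(debug_assertions))]` (the "only reload on version skew in release" commit), so development hot-reload sessions aren't interrupted. The decision reduces to a comparison that's easy to model; `should_reload` is a hypothetical helper, with `is_release` standing in for `cfg!(not(debug_assertions))`:

```rust
/// Should the client force a full page reload after learning the
/// server's version?
fn should_reload(client: &str, server: Option<&str>, is_release: bool) -> bool {
    match server {
        // Only release builds react to version skew.
        Some(server) => is_release && server != client,
        // Server version not known yet: nothing to compare against.
        None => false,
    }
}

fn main() {
    assert!(!should_reload("1.2.0", None, true));
    assert!(!should_reload("1.2.0", Some("1.2.0"), true));
    assert!(should_reload("1.2.0", Some("1.3.0"), true));
    // Dev builds never auto-reload on skew.
    assert!(!should_reload("1.2.0", Some("1.3.0"), false));
}
```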

File diff suppressed because it is too large

web/static/overrides.css Normal file

@@ -0,0 +1,69 @@
html {
  background-color: black;
}

.mail-thread a,
.news-post a {
  color: var(--color-link) !important;
  text-decoration: underline;
}

.mail-thread br,
.news-post br {
  display: block;
  margin-top: 1em;
  content: " ";
}

.mail-thread h1,
.mail-thread h2,
.mail-thread h3,
.mail-thread h4,
.news-post h1,
.news-post h2,
.news-post h3,
.news-post h4 {
  margin-top: 1em !important;
  margin-bottom: 1em !important;
}

.mail-thread p,
.news-post p {
  margin-bottom: 1em;
}

.mail-thread pre,
.mail-thread code,
.news-post pre,
.news-post code {
  font-family: monospace;
  background-color: #eee !important;
  padding: 0.5em !important;
}

.mail-thread blockquote {
  padding-left: 1em;
  border-left: 2px solid #ddd;
}

.mail-thread ol,
.mail-thread ul {
  margin-left: 2em;
}

/* Hackaday figures have unreadable black on dark grey */
.news-post figcaption.wp-caption-text {
  background-color: initial !important;
}

.news-post.site-nautilus .article-ad,
.news-post.site-nautilus .primis-ad {
  display: none !important;
}

.news-post.site-slashdot .story-byline {
  display: block !important;
  height: initial !important;
  overflow: auto !important;
  position: static !important;
}

web/static/vars.css Normal file

@@ -0,0 +1,42 @@
:root {
  --active-brightness: 0.85;
  --border-radius: 5px;
  --box-shadow: 2px 2px 10px;
  --color-accent: #118bee15;
  --color-bg: #fff;
  --color-bg-secondary: #e9e9e9;
  --color-link: #118bee;
  --color-secondary: #920de9;
  --color-secondary-accent: #920de90b;
  --color-shadow: #f4f4f4;
  --color-table: #118bee;
  --color-text: #000;
  --color-text-secondary: #999;
  --color-scrollbar: #cacae8;
  --font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, Oxygen-Sans, Ubuntu, Cantarell, "Helvetica Neue", sans-serif;
  --hover-brightness: 1.2;
  --justify-important: center;
  --justify-normal: left;
  --line-height: 1.5;
  /*
  --width-card: 285px;
  --width-card-medium: 460px;
  --width-card-wide: 800px;
  */
  --width-content: 1080px;
}

@media (prefers-color-scheme: dark) {
  :root[color-mode="user"] {
    --color-accent: #0097fc4f;
    --color-bg: #333;
    --color-bg-secondary: #555;
    --color-link: #0097fc;
    --color-secondary: #e20de9;
    --color-secondary-accent: #e20de94f;
    --color-shadow: #bbbbbb20;
    --color-table: #0097fc;
    --color-text: #f7f7f7;
    --color-text-secondary: #aaa;
  }
}