Compare commits

...

113 Commits

Author SHA1 Message Date
67b7ff01df fix(deps): update all non-major dependencies
Some checks failed
renovate/artifacts Artifact file update failure
Continuous integration / Rustfmt (push) Waiting to run
Continuous integration / build (push) Waiting to run
Continuous integration / Disallow unused dependencies (push) Waiting to run
Continuous integration / Check (push) Failing after 1m2s
Continuous integration / Test Suite (push) Failing after 58s
Continuous integration / Trunk (push) Has been cancelled
2025-07-01 07:01:39 +00:00
ddb4c812ce chore(deps): lock file maintenance
All checks were successful
Continuous integration / Check (push) Successful in 48s
Continuous integration / Test Suite (push) Successful in 1m4s
Continuous integration / Trunk (push) Successful in 7m40s
Continuous integration / Rustfmt (push) Successful in 41s
Continuous integration / build (push) Successful in 1m26s
Continuous integration / Disallow unused dependencies (push) Successful in 2m13s
2025-06-30 00:01:45 +00:00
1aaf914ac5 chore: Release
All checks were successful
Continuous integration / Check (push) Successful in 1m12s
Continuous integration / Test Suite (push) Successful in 2m0s
Continuous integration / Trunk (push) Successful in 1m13s
Continuous integration / Rustfmt (push) Successful in 51s
Continuous integration / build (push) Successful in 3m1s
Continuous integration / Disallow unused dependencies (push) Successful in 2m23s
2025-06-23 13:49:28 -07:00
982b5dae2f server: add disabled column to feed table
All checks were successful
Continuous integration / Check (push) Successful in 45s
Continuous integration / Test Suite (push) Successful in 1m7s
Continuous integration / Trunk (push) Successful in 1m7s
Continuous integration / Rustfmt (push) Successful in 54s
Continuous integration / build (push) Successful in 2m36s
Continuous integration / Disallow unused dependencies (push) Successful in 2m29s
2025-06-23 13:41:11 -07:00
8807c1b1f5 chore(deps): lock file maintenance
All checks were successful
Continuous integration / Check (push) Successful in 1m21s
Continuous integration / Test Suite (push) Successful in 1m32s
Continuous integration / Trunk (push) Successful in 1m19s
Continuous integration / Rustfmt (push) Successful in 1m4s
Continuous integration / build (push) Successful in 2m35s
Continuous integration / Disallow unused dependencies (push) Successful in 2m52s
2025-06-23 19:37:51 +00:00
fa23658ef0 web: remove now obsolete allow directive
All checks were successful
Continuous integration / Check (push) Successful in 1m21s
Continuous integration / Test Suite (push) Successful in 1m31s
Continuous integration / Trunk (push) Successful in 1m20s
Continuous integration / Rustfmt (push) Successful in 1m3s
Continuous integration / build (push) Successful in 3m30s
Continuous integration / Disallow unused dependencies (push) Successful in 2m41s
2025-06-23 12:32:23 -07:00
f175faed98 fix(deps): update rust crate css-inline to v0.14.5
All checks were successful
Continuous integration / Check (push) Successful in 39s
Continuous integration / Test Suite (push) Successful in 1m1s
Continuous integration / Trunk (push) Successful in 38s
Continuous integration / Rustfmt (push) Successful in 33s
Continuous integration / build (push) Successful in 53s
Continuous integration / Disallow unused dependencies (push) Successful in 2m2s
2025-06-16 21:46:30 +00:00
8971c16117 chore(deps): lock file maintenance
All checks were successful
Continuous integration / Check (push) Successful in 39s
Continuous integration / Test Suite (push) Successful in 50s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / Trunk (push) Successful in 50s
Continuous integration / build (push) Successful in 54s
Continuous integration / Disallow unused dependencies (push) Successful in 2m2s
2025-06-16 00:01:44 +00:00
fbecf564b5 fix(deps): update rust crate reqwest to v0.12.20
All checks were successful
Continuous integration / Check (push) Successful in 37s
Continuous integration / Test Suite (push) Successful in 43s
Continuous integration / Trunk (push) Successful in 37s
Continuous integration / Rustfmt (push) Successful in 37s
Continuous integration / build (push) Successful in 53s
Continuous integration / Disallow unused dependencies (push) Successful in 2m0s
2025-06-10 19:16:14 +00:00
e5643c6fd0 fix(deps): update rust crate clap to v4.5.40
All checks were successful
Continuous integration / Test Suite (push) Successful in 45s
Continuous integration / Trunk (push) Successful in 37s
Continuous integration / Check (push) Successful in 1m30s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / Disallow unused dependencies (push) Successful in 54s
Continuous integration / build (push) Successful in 1m50s
2025-06-09 18:31:15 +00:00
a8734269f7 chore(deps): lock file maintenance
All checks were successful
Continuous integration / Check (push) Successful in 40s
Continuous integration / Test Suite (push) Successful in 44s
Continuous integration / Rustfmt (push) Successful in 33s
Continuous integration / Trunk (push) Successful in 52s
Continuous integration / build (push) Successful in 51s
Continuous integration / Disallow unused dependencies (push) Successful in 2m2s
2025-06-09 00:01:43 +00:00
cab4e571f3 fix(deps): update all non-major dependencies
All checks were successful
Continuous integration / Check (push) Successful in 37s
Continuous integration / Test Suite (push) Successful in 1m13s
Continuous integration / Trunk (push) Successful in 38s
Continuous integration / Rustfmt (push) Successful in 28s
Continuous integration / build (push) Successful in 52s
Continuous integration / Disallow unused dependencies (push) Successful in 1m57s
2025-06-03 13:16:29 +00:00
4d6c6af7d9 fix(deps): update all non-major dependencies
All checks were successful
Continuous integration / Check (push) Successful in 38s
Continuous integration / Test Suite (push) Successful in 43s
Continuous integration / Trunk (push) Successful in 38s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / Disallow unused dependencies (push) Successful in 55s
Continuous integration / build (push) Successful in 1m44s
2025-06-02 12:47:12 +00:00
cf08831ed1 chore(deps): lock file maintenance
All checks were successful
Continuous integration / Check (push) Successful in 38s
Continuous integration / Trunk (push) Successful in 38s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / build (push) Successful in 50s
Continuous integration / Disallow unused dependencies (push) Successful in 55s
Continuous integration / Test Suite (push) Successful in 3m48s
2025-06-02 03:32:02 +00:00
e1509c5978 fix(deps): update all non-major dependencies
All checks were successful
Continuous integration / Check (push) Successful in 1m5s
Continuous integration / Test Suite (push) Successful in 44s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / Trunk (push) Successful in 1m18s
Continuous integration / build (push) Successful in 51s
Continuous integration / Disallow unused dependencies (push) Successful in 2m24s
2025-06-01 20:31:35 -07:00
13db8e6f1f chore(deps): lock file maintenance
All checks were successful
Continuous integration / Test Suite (push) Successful in 43s
Continuous integration / Check (push) Successful in 1m0s
Continuous integration / Trunk (push) Successful in 38s
Continuous integration / Rustfmt (push) Successful in 48s
Continuous integration / build (push) Successful in 53s
Continuous integration / Disallow unused dependencies (push) Successful in 2m0s
2025-06-02 02:46:35 +00:00
136a837fa4 chore(deps): lock file maintenance
All checks were successful
Continuous integration / Check (push) Successful in 46s
Continuous integration / Test Suite (push) Successful in 43s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / build (push) Successful in 1m8s
Continuous integration / Disallow unused dependencies (push) Successful in 54s
Continuous integration / Trunk (push) Successful in 7m14s
2025-06-02 00:01:42 +00:00
1ea058c664 fix(deps): update all non-major dependencies
All checks were successful
Continuous integration / Check (push) Successful in 40s
Continuous integration / Test Suite (push) Successful in 41s
Continuous integration / Trunk (push) Successful in 37s
Continuous integration / Rustfmt (push) Successful in 39s
Continuous integration / build (push) Successful in 49s
Continuous integration / Disallow unused dependencies (push) Successful in 2m0s
2025-05-28 16:16:24 +00:00
f4c11c5b3f fix(deps): update all non-major dependencies
All checks were successful
Continuous integration / Check (push) Successful in 40s
Continuous integration / Test Suite (push) Successful in 41s
Continuous integration / Trunk (push) Successful in 37s
Continuous integration / Rustfmt (push) Successful in 39s
Continuous integration / build (push) Successful in 53s
Continuous integration / Disallow unused dependencies (push) Successful in 2m1s
2025-05-28 13:01:55 +00:00
8dc8f3a0f8 chore(deps): lock file maintenance
All checks were successful
Continuous integration / Check (push) Successful in 38s
Continuous integration / Test Suite (push) Successful in 41s
Continuous integration / Rustfmt (push) Successful in 40s
Continuous integration / build (push) Successful in 2m56s
Continuous integration / Trunk (push) Successful in 3m43s
Continuous integration / Disallow unused dependencies (push) Successful in 2m2s
2025-05-26 00:01:31 +00:00
7b9450b65b fix(deps): update all non-major dependencies
All checks were successful
Continuous integration / Check (push) Successful in 54s
Continuous integration / Test Suite (push) Successful in 1m5s
Continuous integration / Trunk (push) Successful in 50s
Continuous integration / Rustfmt (push) Successful in 38s
Continuous integration / build (push) Successful in 1m23s
Continuous integration / Disallow unused dependencies (push) Successful in 1m51s
2025-05-24 14:47:03 +00:00
b5de0719dd fix(deps): update all non-major dependencies
All checks were successful
Continuous integration / Test Suite (push) Successful in 47s
Continuous integration / Check (push) Successful in 58s
Continuous integration / Trunk (push) Successful in 37s
Continuous integration / Rustfmt (push) Successful in 39s
Continuous integration / build (push) Successful in 52s
Continuous integration / Disallow unused dependencies (push) Successful in 2m0s
2025-05-24 02:31:52 +00:00
58da28a19b fix(deps): update all non-major dependencies
All checks were successful
Continuous integration / Test Suite (push) Successful in 39s
Continuous integration / Check (push) Successful in 51s
Continuous integration / Trunk (push) Successful in 37s
Continuous integration / Rustfmt (push) Successful in 35s
Continuous integration / build (push) Successful in 47s
Continuous integration / Disallow unused dependencies (push) Successful in 2m1s
2025-05-23 23:31:44 +00:00
75ad27ec2f chore: Release
All checks were successful
Continuous integration / Check (push) Successful in 35s
Continuous integration / Test Suite (push) Successful in 40s
Continuous integration / Trunk (push) Successful in 37s
Continuous integration / Rustfmt (push) Successful in 44s
Continuous integration / build (push) Successful in 48s
Continuous integration / Disallow unused dependencies (push) Successful in 2m5s
2025-05-23 16:22:27 -07:00
f904fa0001 Add slurp and CSS for seiya-me 2025-05-23 16:21:57 -07:00
b94596bf65 fix(deps): update all non-major dependencies
All checks were successful
Continuous integration / Check (push) Successful in 37s
Continuous integration / Trunk (push) Successful in 37s
Continuous integration / Test Suite (push) Successful in 1m13s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / build (push) Successful in 1m19s
Continuous integration / Disallow unused dependencies (push) Successful in 57s
2025-05-22 15:01:32 +00:00
aa24599921 chore(deps): lock file maintenance
All checks were successful
Continuous integration / Check (push) Successful in 41s
Continuous integration / Test Suite (push) Successful in 40s
Continuous integration / Trunk (push) Successful in 41s
Continuous integration / Rustfmt (push) Successful in 38s
Continuous integration / build (push) Successful in 47s
Continuous integration / Disallow unused dependencies (push) Successful in 2m12s
2025-05-19 00:01:49 +00:00
c81a8c1cd3 chore: Release
All checks were successful
Continuous integration / Check (push) Successful in 35s
Continuous integration / Test Suite (push) Successful in 40s
Continuous integration / Trunk (push) Successful in 37s
Continuous integration / Rustfmt (push) Successful in 39s
Continuous integration / build (push) Successful in 47s
Continuous integration / Disallow unused dependencies (push) Successful in 2m5s
2025-05-18 09:54:26 -07:00
7c3cfec3d1 web: improve keep unread logic in catchup, remove excess logging 2025-05-18 09:54:03 -07:00
a2920fde3b chore(deps): lock file maintenance
All checks were successful
Continuous integration / Check (push) Successful in 42s
Continuous integration / Test Suite (push) Successful in 2m51s
Continuous integration / Rustfmt (push) Successful in 37s
Continuous integration / Trunk (push) Successful in 4m0s
Continuous integration / Disallow unused dependencies (push) Successful in 58s
Continuous integration / build (push) Successful in 3m31s
2025-05-12 00:01:38 +00:00
8bc449ae6e fix(deps): update rust crate clap to v4.5.38
All checks were successful
Continuous integration / Check (push) Successful in 40s
Continuous integration / Trunk (push) Successful in 37s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / Test Suite (push) Successful in 2m0s
Continuous integration / build (push) Successful in 55s
Continuous integration / Disallow unused dependencies (push) Successful in 2m0s
2025-05-11 01:16:28 +00:00
0febd0535a fix(deps): update rust crate tower-http to v0.6.4
All checks were successful
Continuous integration / Check (push) Successful in 43s
Continuous integration / Test Suite (push) Successful in 49s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / build (push) Successful in 55s
Continuous integration / Disallow unused dependencies (push) Successful in 55s
Continuous integration / Trunk (push) Successful in 7m9s
2025-05-10 20:46:27 +00:00
a9e00a54e4 fix(deps): update rust crate tower-http to v0.6.3
All checks were successful
Continuous integration / Check (push) Successful in 1m3s
Continuous integration / Test Suite (push) Successful in 1m6s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / build (push) Successful in 56s
Continuous integration / Disallow unused dependencies (push) Successful in 55s
Continuous integration / Trunk (push) Successful in 7m14s
2025-05-07 19:46:07 +00:00
6811c689ff fix(deps): update rust crate tokio to v1.45.0
All checks were successful
Continuous integration / Check (push) Successful in 40s
Continuous integration / Trunk (push) Successful in 38s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / Test Suite (push) Successful in 2m18s
Continuous integration / build (push) Successful in 56s
Continuous integration / Disallow unused dependencies (push) Successful in 2m10s
2025-05-06 06:46:13 +00:00
8ba6b3d0b0 chore(deps): lock file maintenance
All checks were successful
Continuous integration / Check (push) Successful in 50s
Continuous integration / Test Suite (push) Successful in 49s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / build (push) Successful in 1m13s
Continuous integration / Disallow unused dependencies (push) Successful in 58s
Continuous integration / Trunk (push) Successful in 7m14s
2025-05-05 00:01:38 +00:00
a7c5585e80 fix(deps): update rust crate axum to v0.8.4
All checks were successful
Continuous integration / Check (push) Successful in 39s
Continuous integration / Trunk (push) Successful in 37s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / Test Suite (push) Successful in 2m10s
Continuous integration / build (push) Successful in 55s
Continuous integration / Disallow unused dependencies (push) Successful in 1m59s
2025-04-30 16:46:20 +00:00
4ef4d49113 fix(deps): update rust crate chrono to v0.4.41
All checks were successful
Continuous integration / Check (push) Successful in 41s
Continuous integration / Test Suite (push) Successful in 49s
Continuous integration / Rustfmt (push) Successful in 40s
Continuous integration / Trunk (push) Successful in 3m47s
Continuous integration / build (push) Successful in 3m24s
Continuous integration / Disallow unused dependencies (push) Successful in 55s
2025-04-29 09:31:11 +00:00
f8af303110 chore(deps): lock file maintenance
All checks were successful
Continuous integration / Check (push) Successful in 40s
Continuous integration / Test Suite (push) Successful in 1m16s
Continuous integration / Rustfmt (push) Successful in 34s
Continuous integration / build (push) Successful in 1m21s
Continuous integration / Disallow unused dependencies (push) Successful in 55s
Continuous integration / Trunk (push) Successful in 7m21s
2025-04-28 00:01:40 +00:00
fa5aac34ba chore: Release
All checks were successful
Continuous integration / Check (push) Successful in 38s
Continuous integration / Test Suite (push) Successful in 44s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / Trunk (push) Successful in 55s
Continuous integration / build (push) Successful in 51s
Continuous integration / Disallow unused dependencies (push) Successful in 2m7s
2025-04-24 12:03:13 -07:00
b58556254e notmuch: log any stderr output 2025-04-24 12:02:55 -07:00
e365ced7dd server: more concise slice of ids 2025-04-24 12:02:40 -07:00
93d569fb14 chore: Release
All checks were successful
Continuous integration / Check (push) Successful in 38s
Continuous integration / Test Suite (push) Successful in 44s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / Trunk (push) Successful in 51s
Continuous integration / build (push) Successful in 52s
Continuous integration / Disallow unused dependencies (push) Successful in 2m4s
2025-04-24 09:04:42 -07:00
f86a5f464d server: properly limit index 2025-04-24 09:04:22 -07:00
956c20b156 chore: Release
All checks were successful
Continuous integration / Check (push) Successful in 38s
Continuous integration / Trunk (push) Successful in 38s
Continuous integration / Test Suite (push) Successful in 1m13s
Continuous integration / Rustfmt (push) Successful in 30s
Continuous integration / Disallow unused dependencies (push) Successful in 56s
Continuous integration / build (push) Successful in 1m34s
2025-04-24 08:56:56 -07:00
1eb498712b server: prevent out of bounds index at end of processing 2025-04-24 08:56:19 -07:00
f12979c0be chore: Release
All checks were successful
Continuous integration / Check (push) Successful in 38s
Continuous integration / Test Suite (push) Successful in 45s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / build (push) Successful in 52s
Continuous integration / Disallow unused dependencies (push) Successful in 55s
Continuous integration / Trunk (push) Successful in 7m10s
2025-04-23 18:59:16 -07:00
4665f34e54 server: label_unprocessed handle case where files cannot be found from message-id 2025-04-23 18:57:54 -07:00
bbdc35061c chore: Release
All checks were successful
Continuous integration / Check (push) Successful in 38s
Continuous integration / Test Suite (push) Successful in 45s
Continuous integration / Trunk (push) Successful in 37s
Continuous integration / Rustfmt (push) Successful in 39s
Continuous integration / build (push) Successful in 52s
Continuous integration / Disallow unused dependencies (push) Successful in 2m9s
2025-04-23 15:25:34 -07:00
f11f0b4d23 server: migrate all use of log to tracing 2025-04-23 15:25:11 -07:00
c7c47e4a73 chore: Release
All checks were successful
Continuous integration / Check (push) Successful in 38s
Continuous integration / Test Suite (push) Successful in 44s
Continuous integration / Trunk (push) Successful in 38s
Continuous integration / Rustfmt (push) Successful in 39s
Continuous integration / build (push) Successful in 51s
Continuous integration / Disallow unused dependencies (push) Successful in 2m5s
2025-04-23 14:57:39 -07:00
c3835522b2 server: add Letterbox/Bad label to unparsable emails, and consider them processed 2025-04-23 14:57:13 -07:00
dfa80f9046 chore: Release
All checks were successful
Continuous integration / Check (push) Successful in 38s
Continuous integration / Test Suite (push) Successful in 55s
Continuous integration / Trunk (push) Successful in 52s
Continuous integration / Rustfmt (push) Successful in 39s
Continuous integration / Disallow unused dependencies (push) Successful in 58s
Continuous integration / build (push) Successful in 1m32s
2025-04-23 14:41:25 -07:00
b8dfdabf8d server: more tracing and logging 2025-04-23 14:41:11 -07:00
bbcf52b006 chore: Release
All checks were successful
Continuous integration / Check (push) Successful in 38s
Continuous integration / Test Suite (push) Successful in 43s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / Trunk (push) Successful in 51s
Continuous integration / build (push) Successful in 51s
Continuous integration / Disallow unused dependencies (push) Successful in 2m5s
2025-04-23 11:38:48 -07:00
f92c05cd28 server: return ids processed from send_refresh_websocket_handler 2025-04-23 11:38:30 -07:00
885bbe0a8c chore: Release
All checks were successful
Continuous integration / Check (push) Successful in 40s
Continuous integration / Test Suite (push) Successful in 45s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / Trunk (push) Successful in 52s
Continuous integration / build (push) Successful in 55s
Continuous integration / Disallow unused dependencies (push) Successful in 2m5s
2025-04-23 11:09:19 -07:00
8b1d111837 chore: Release
All checks were successful
Continuous integration / Check (push) Successful in 38s
Continuous integration / Trunk (push) Successful in 37s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / Test Suite (push) Successful in 1m36s
Continuous integration / build (push) Successful in 51s
Continuous integration / Disallow unused dependencies (push) Successful in 1m59s
2025-04-23 11:02:46 -07:00
08abf31fa9 server: always remove unprocessed label when processing rules 2025-04-23 11:02:29 -07:00
fa99959508 chore: Release
All checks were successful
Continuous integration / Check (push) Successful in 38s
Continuous integration / Test Suite (push) Successful in 46s
Continuous integration / Trunk (push) Successful in 38s
Continuous integration / Rustfmt (push) Successful in 38s
Continuous integration / build (push) Successful in 53s
Continuous integration / Disallow unused dependencies (push) Successful in 1m55s
2025-04-23 09:31:43 -07:00
0f6af0f475 server: more debug prints 2025-04-23 09:31:25 -07:00
4c486e9168 chore: Release
All checks were successful
Continuous integration / Check (push) Successful in 38s
Continuous integration / Test Suite (push) Successful in 45s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / Trunk (push) Successful in 53s
Continuous integration / build (push) Successful in 51s
Continuous integration / Disallow unused dependencies (push) Successful in 2m4s
2025-04-22 22:43:37 -07:00
109d380ea7 server: remove inbox on no-match 2025-04-22 22:43:22 -07:00
4244fa0d82 chore: Release
All checks were successful
Continuous integration / Check (push) Successful in 38s
Continuous integration / Test Suite (push) Successful in 45s
Continuous integration / Rustfmt (push) Successful in 33s
Continuous integration / Trunk (push) Successful in 51s
Continuous integration / build (push) Successful in 53s
Continuous integration / Disallow unused dependencies (push) Successful in 2m1s
2025-04-22 22:41:26 -07:00
4b15e71893 server: remove unprocessed appropriately 2025-04-22 22:41:09 -07:00
1bbebad01b chore: Release
All checks were successful
Continuous integration / Check (push) Successful in 41s
Continuous integration / Test Suite (push) Successful in 45s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / Trunk (push) Successful in 51s
Continuous integration / build (push) Successful in 53s
Continuous integration / Disallow unused dependencies (push) Successful in 2m5s
2025-04-22 22:28:20 -07:00
27edffd090 Set version for all packages 2025-04-22 22:28:03 -07:00
08212a9f78 chore: Release 2025-04-22 22:26:17 -07:00
877ec6c4b0 server: drop version requirement 2025-04-22 22:26:03 -07:00
3ce92d6bdf chore: Release 2025-04-22 22:24:37 -07:00
1a28bb2021 Use path for notmuch crate 2025-04-22 22:24:07 -07:00
b86f72f75c chore: Release 2025-04-22 22:20:00 -07:00
1a8b98d420 Use relative import for notmuch 2025-04-22 22:19:45 -07:00
383a7d800f chore: Release 2025-04-22 22:18:50 -07:00
453561140a server: batch tag changes and add default Grey tag 2025-04-22 22:18:24 -07:00
f6d5d3755b chore: Release
All checks were successful
Continuous integration / Check (push) Successful in 38s
Continuous integration / Test Suite (push) Successful in 46s
Continuous integration / Trunk (push) Successful in 38s
Continuous integration / Rustfmt (push) Successful in 36s
Continuous integration / build (push) Successful in 52s
Continuous integration / Disallow unused dependencies (push) Successful in 2m8s
2025-04-22 21:24:53 -07:00
5226fe090e server & web: run label_unprocessed before notifying web client 2025-04-22 21:22:50 -07:00
c10ad00ca7 chore: Release
All checks were successful
Continuous integration / Check (push) Successful in 38s
Continuous integration / Trunk (push) Successful in 38s
Continuous integration / Test Suite (push) Successful in 1m18s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / Disallow unused dependencies (push) Successful in 56s
Continuous integration / build (push) Successful in 1m39s
2025-04-22 17:52:04 -07:00
64fc92c3d6 web: refresh including the server side on websocket reconnect 2025-04-22 17:51:53 -07:00
b9c116d5b6 server: mark spam as read 2025-04-22 17:51:53 -07:00
007200b37b fix(deps): update rust crate xtracing to v0.3.2
All checks were successful
Continuous integration / Check (push) Successful in 37s
Continuous integration / Test Suite (push) Successful in 44s
Continuous integration / Trunk (push) Successful in 37s
Continuous integration / Rustfmt (push) Successful in 39s
Continuous integration / build (push) Successful in 51s
Continuous integration / Disallow unused dependencies (push) Successful in 2m7s
2025-04-22 23:01:17 +00:00
9824ad1e18 chore(deps): lock file maintenance
All checks were successful
Continuous integration / Test Suite (push) Successful in 45s
Continuous integration / Check (push) Successful in 46s
Continuous integration / Trunk (push) Successful in 37s
Continuous integration / Rustfmt (push) Successful in 38s
Continuous integration / build (push) Successful in 51s
Continuous integration / Disallow unused dependencies (push) Successful in 2m7s
2025-04-22 15:16:24 +00:00
a8819c7551 gitea: use nightly when doing trunk build
All checks were successful
Continuous integration / Test Suite (push) Successful in 44s
Continuous integration / Check (push) Successful in 1m23s
Continuous integration / Rustfmt (push) Successful in 37s
Continuous integration / Trunk (push) Successful in 3m47s
Continuous integration / Disallow unused dependencies (push) Successful in 55s
Continuous integration / build (push) Successful in 3m45s
2025-04-22 08:13:38 -07:00
8cdfbdd08f chore: Release
Some checks failed
Continuous integration / build (push) Has been cancelled
Continuous integration / Disallow unused dependencies (push) Has been cancelled
Continuous integration / Rustfmt (push) Has been cancelled
Continuous integration / Trunk (push) Has been cancelled
Continuous integration / Test Suite (push) Has been cancelled
Continuous integration / Check (push) Has been cancelled
2025-04-22 07:59:42 -07:00
b2d1dc9276 cargo update && cargo upgrade 2025-04-22 07:59:12 -07:00
1f79b43a85 chore: Release
Some checks failed
Continuous integration / Check (push) Successful in 37s
Continuous integration / Test Suite (push) Successful in 44s
Continuous integration / Trunk (push) Failing after 36s
Continuous integration / Rustfmt (push) Successful in 38s
Continuous integration / build (push) Successful in 51s
Continuous integration / Disallow unused dependencies (push) Successful in 1m57s
2025-04-21 22:01:49 -07:00
904619bccd chore: Release 2025-04-21 22:01:41 -07:00
14104f6469 Remove non-hermetic default flag values
Some checks failed
Continuous integration / Test Suite (push) Successful in 57s
Continuous integration / Trunk (push) Failing after 38s
Continuous integration / Check (push) Successful in 2m1s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / Disallow unused dependencies (push) Successful in 1m14s
Continuous integration / build (push) Successful in 1m37s
2025-04-21 21:59:22 -07:00
dccfb6f71f chore: Release
Some checks failed
Continuous integration / Check (push) Failing after 36s
Continuous integration / Test Suite (push) Failing after 43s
Continuous integration / Trunk (push) Failing after 36s
Continuous integration / Rustfmt (push) Successful in 37s
Continuous integration / build (push) Failing after 51s
Continuous integration / Disallow unused dependencies (push) Failing after 1m58s
2025-04-21 21:20:51 -07:00
547266a705 Fix imports for letterbox-* packages 2025-04-21 21:20:31 -07:00
273562b58c chore: Release 2025-04-21 21:16:43 -07:00
dc39eed1a7 cargo sqlx prepare 2025-04-21 21:16:42 -07:00
9178badfd0 Add mail tagging support 2025-04-21 21:15:55 -07:00
38e75ec251 web: make random emoji selection more deterministic 2025-04-21 10:12:12 -07:00
c1496bf87b server: doc cleanup 2025-04-20 10:48:59 -07:00
4da888b240 Move id format check from server into notmuch 2025-04-20 10:47:40 -07:00
c703be2ca5 server: more robust view original serving 2025-04-20 10:01:22 -07:00
5cec8add5e chore: Release
Some checks failed
Continuous integration / Check (push) Successful in 42s
Continuous integration / Trunk (push) Failing after 38s
Continuous integration / Test Suite (push) Successful in 1m20s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / build (push) Successful in 1m20s
Continuous integration / Disallow unused dependencies (push) Successful in 1m0s
2025-04-20 09:46:49 -07:00
0225dbde3a procmail2notmuch: don't run migration code, leave it to server 2025-04-20 09:46:27 -07:00
f84b8fa6c2 chore: Release 2025-04-20 09:38:35 -07:00
979cbcd23e procmail2notmuch: include early exit option 2025-04-20 09:37:51 -07:00
b3070e1919 web: use random emoji when search results empty, handle search vs catchup 2025-04-20 09:37:12 -07:00
e5fdde8f30 web: add graphic when search results are empty 2025-04-20 09:07:43 -07:00
7de36bbc3d procmail2notmuch: add sql rule loader 2025-04-20 08:40:06 -07:00
1c4f27902e server: add todo 2025-04-20 08:39:47 -07:00
7ee86f0d2f chore: Release
Some checks failed
Continuous integration / Check (push) Successful in 36s
Continuous integration / Test Suite (push) Successful in 41s
Continuous integration / Trunk (push) Failing after 35s
Continuous integration / Rustfmt (push) Successful in 38s
Continuous integration / build (push) Successful in 47s
Continuous integration / Disallow unused dependencies (push) Successful in 1m57s
2025-04-19 13:19:14 -07:00
a0b06fd5ef chore: Release 2025-04-19 13:17:01 -07:00
630bb20b35 procmail2notmuch: add debug vs notmuchrc modes 2025-04-19 13:16:47 -07:00
17ea2a35cb web: tweak style and behavior of view original link 2025-04-19 13:11:57 -07:00
7d9376d607 Add view original functionality 2025-04-19 12:33:11 -07:00
122e949072 chore: Release 2025-04-16 08:48:35 -07:00
9a69b4c51e web: scroll to top on pagination 2025-04-16 08:47:45 -07:00
251151244b chore: Release 2025-04-15 20:38:08 -07:00
9d232b666b server: add debug message for WS connection 2025-04-15 20:37:35 -07:00
27 changed files with 1631 additions and 1157 deletions


@@ -26,7 +26,7 @@ jobs:
       - uses: actions/checkout@v4
       - uses: actions-rust-lang/setup-rust-toolchain@v1
         with:
-          toolchain: stable
+          toolchain: nightly
           target: wasm32-unknown-unknown
       - run: cargo install trunk
       - run: cd web; trunk build

Cargo.lock generated (1420 changed lines; diff suppressed because it is too large)


@@ -8,7 +8,7 @@ authors = ["Bill Thiede <git@xinu.tv>"]
 edition = "2021"
 license = "UNLICENSED"
 publish = ["xinu"]
-version = "0.15.5"
+version = "0.17.26"
 repository = "https://git.z.xinu.tv/wathiede/letterbox"
 [profile.dev]


@@ -11,14 +11,14 @@ version.workspace = true
 [dependencies]
-log = "0.4.14"
+log = "0.4.27"
-mailparse = "0.16.0"
+mailparse = "0.16.1"
 serde = { version = "1.0", features = ["derive"] }
 serde_json = { version = "1.0", features = ["unbounded_depth"] }
-thiserror = "2.0.0"
+thiserror = "2.0.12"
 tracing = "0.1.41"
 [dev-dependencies]
 itertools = "0.14.0"
 pretty_assertions = "1"
-rayon = "1.5"
+rayon = "1.10"


@@ -214,9 +214,8 @@ use std::{
     process::Command,
 };
-use log::{error, info};
 use serde::{Deserialize, Serialize};
-use tracing::instrument;
+use tracing::{error, info, instrument, warn};
 /// # Number of seconds since the Epoch
 pub type UnixTime = isize;
@@ -503,15 +502,28 @@
         self.tags_for_query("*")
     }
+    #[instrument(skip_all, fields(tag=tag,search_term=search_term))]
     pub fn tag_add(&self, tag: &str, search_term: &str) -> Result<(), NotmuchError> {
-        self.run_notmuch(["tag", &format!("+{tag}"), search_term])?;
+        self.tags_add(tag, &[search_term])
+    }
+    #[instrument(skip_all, fields(tag=tag,search_term=?search_term))]
+    pub fn tags_add(&self, tag: &str, search_term: &[&str]) -> Result<(), NotmuchError> {
+        let tag = format!("+{tag}");
+        let mut args = vec!["tag", &tag];
+        args.extend(search_term);
+        self.run_notmuch(&args)?;
         Ok(())
     }
+    #[instrument(skip_all, fields(tag=tag,search_term=search_term))]
     pub fn tag_remove(&self, tag: &str, search_term: &str) -> Result<(), NotmuchError> {
-        self.run_notmuch(["tag", &format!("-{tag}"), search_term])?;
+        self.tags_remove(tag, &[search_term])
+    }
+    #[instrument(skip_all, fields(tag=tag,search_term=?search_term))]
+    pub fn tags_remove(&self, tag: &str, search_term: &[&str]) -> Result<(), NotmuchError> {
+        let tag = format!("-{tag}");
+        let mut args = vec!["tag", &tag];
+        args.extend(search_term);
+        self.run_notmuch(&args)?;
         Ok(())
     }
@@ -598,6 +610,11 @@ impl Notmuch {
     #[instrument(skip_all, fields(id=id,part=part))]
     pub fn show_original_part(&self, id: &MessageId, part: usize) -> Result<Vec<u8>, NotmuchError> {
+        let id = if id.starts_with("id:") {
+            id
+        } else {
+            &format!("id:{id}")
+        };
         let res = self.run_notmuch(["show", "--part", &part.to_string(), id])?;
         Ok(res)
     }
@@ -700,6 +717,13 @@ impl Notmuch {
         cmd.args(args);
         info!("{:?}", &cmd);
         let out = cmd.output()?;
+        if !out.stderr.is_empty() {
+            warn!(
+                "{:?}: STDERR:\n{}",
+                &cmd,
+                String::from_utf8_lossy(&out.stderr)
+            );
+        }
         Ok(out.stdout)
     }
 }
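The `tags_add`/`tags_remove` methods batch many search terms into a single `notmuch tag` invocation instead of spawning one process per term. A minimal standalone sketch of the same argument-building pattern (the `tag_args` helper is hypothetical, not part of the crate's API):

```rust
/// Build argv for one batched `notmuch tag` call, mirroring `tags_add`:
/// `notmuch tag +<tag> <term1> <term2> ...`
fn tag_args(tag: &str, search_terms: &[&str]) -> Vec<String> {
    let mut args = vec!["tag".to_string(), format!("+{tag}")];
    // Every search term is appended to the same invocation.
    args.extend(search_terms.iter().map(|s| s.to_string()));
    args
}

fn main() {
    let args = tag_args("archive", &["id:a@x", "id:b@x"]);
    // One subprocess spawn tags both messages.
    assert_eq!(args, ["tag", "+archive", "id:a@x", "id:b@x"]);
    println!("{}", args.join(" "));
}
```

Collapsing N per-message spawns into one process is the point of the batching; the old `tag_add` becomes a thin wrapper over the batched form.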


@@ -11,4 +11,10 @@ version.workspace = true
 # See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
 [dependencies]
-anyhow = "1.0.69"
+anyhow = "1.0.98"
+clap = { version = "4.5.37", features = ["derive", "env"] }
+letterbox-notmuch = { version = "0.17.9", registry = "xinu" }
+letterbox-shared = { version = "0.17.9", registry = "xinu" }
+serde = { version = "1.0.219", features = ["derive"] }
+sqlx = { version = "0.8.5", features = ["postgres", "runtime-tokio"] }
+tokio = { version = "1.44.2", features = ["rt", "macros", "rt-multi-thread"] }


@@ -1,210 +1,36 @@
-use std::{convert::Infallible, io::Write, str::FromStr};
-#[derive(Debug, Default)]
-enum MatchType {
-    From,
-    Sender,
-    To,
-    Cc,
-    Subject,
-    List,
-    DeliveredTo,
-    XForwardedTo,
-    ReplyTo,
-    XOriginalTo,
-    XSpam,
-    Body,
-    #[default]
-    Unknown,
-}
-#[derive(Debug, Default)]
-struct Match {
-    match_type: MatchType,
-    needle: String,
-}
-#[derive(Debug, Default)]
-struct Rule {
-    matches: Vec<Match>,
-    tags: Vec<String>,
-}
-fn unescape(s: &str) -> String {
-    s.replace('\\', "")
-}
-fn cleanup_match(prefix: &str, s: &str) -> String {
-    unescape(&s[prefix.len()..]).replace(".*", "")
-}
-mod matches {
-    pub const TO: &'static str = "TO";
-    pub const CC: &'static str = "Cc";
-    pub const TOCC: &'static str = "(TO|Cc)";
-    pub const FROM: &'static str = "From";
-    pub const SENDER: &'static str = "Sender";
-    pub const SUBJECT: &'static str = "Subject";
-    pub const DELIVERED_TO: &'static str = "Delivered-To";
-    pub const X_FORWARDED_TO: &'static str = "X-Forwarded-To";
-    pub const REPLY_TO: &'static str = "Reply-To";
-    pub const X_ORIGINAL_TO: &'static str = "X-Original-To";
-    pub const LIST_ID: &'static str = "List-ID";
-    pub const X_SPAM: &'static str = "X-Spam";
-    pub const X_SPAM_FLAG: &'static str = "X-Spam-Flag";
-}
-impl FromStr for Match {
-    type Err = Infallible;
-    fn from_str(s: &str) -> Result<Self, Self::Err> {
-        // Examples:
-        // "* 1^0 ^TOsonyrewards.com@xinu.tv"
-        // "* ^TOsonyrewards.com@xinu.tv"
-        let mut it = s.split_whitespace().skip(1);
-        let mut needle = it.next().unwrap();
-        if needle == "1^0" {
-            needle = it.next().unwrap();
-        }
-        let mut needle = vec![needle];
-        needle.extend(it);
-        let needle = needle.join(" ");
-        let first = needle.chars().nth(0).unwrap_or(' ');
-        use matches::*;
-        if first == '^' {
-            let needle = &needle[1..];
-            if needle.starts_with(TO) {
-                return Ok(Match {
-                    match_type: MatchType::To,
-                    needle: cleanup_match(TO, needle),
-                });
-            } else if needle.starts_with(FROM) {
-                return Ok(Match {
-                    match_type: MatchType::From,
-                    needle: cleanup_match(FROM, needle),
-                });
-            } else if needle.starts_with(CC) {
-                return Ok(Match {
-                    match_type: MatchType::Cc,
-                    needle: cleanup_match(CC, needle),
-                });
-            } else if needle.starts_with(TOCC) {
-                return Ok(Match {
-                    match_type: MatchType::To,
-                    needle: cleanup_match(TOCC, needle),
-                });
-            } else if needle.starts_with(SENDER) {
-                return Ok(Match {
-                    match_type: MatchType::Sender,
-                    needle: cleanup_match(SENDER, needle),
-                });
-            } else if needle.starts_with(SUBJECT) {
-                return Ok(Match {
-                    match_type: MatchType::Subject,
-                    needle: cleanup_match(SUBJECT, needle),
-                });
-            } else if needle.starts_with(X_ORIGINAL_TO) {
-                return Ok(Match {
-                    match_type: MatchType::XOriginalTo,
-                    needle: cleanup_match(X_ORIGINAL_TO, needle),
-                });
-            } else if needle.starts_with(LIST_ID) {
-                return Ok(Match {
-                    match_type: MatchType::List,
-                    needle: cleanup_match(LIST_ID, needle),
-                });
-            } else if needle.starts_with(REPLY_TO) {
-                return Ok(Match {
-                    match_type: MatchType::ReplyTo,
-                    needle: cleanup_match(REPLY_TO, needle),
-                });
-            } else if needle.starts_with(X_SPAM_FLAG) {
-                return Ok(Match {
-                    match_type: MatchType::XSpam,
-                    needle: '*'.to_string(),
-                });
-            } else if needle.starts_with(X_SPAM) {
-                return Ok(Match {
-                    match_type: MatchType::XSpam,
-                    needle: '*'.to_string(),
-                });
-            } else if needle.starts_with(DELIVERED_TO) {
-                return Ok(Match {
-                    match_type: MatchType::DeliveredTo,
-                    needle: cleanup_match(DELIVERED_TO, needle),
-                });
-            } else if needle.starts_with(X_FORWARDED_TO) {
-                return Ok(Match {
-                    match_type: MatchType::XForwardedTo,
-                    needle: cleanup_match(X_FORWARDED_TO, needle),
-                });
-            } else {
-                unreachable!("needle: '{needle}'")
-            }
-        } else {
-            return Ok(Match {
-                match_type: MatchType::Body,
-                needle: cleanup_match("", &needle),
-            });
-        }
-    }
-}
-fn notmuch_from_rules<W: Write>(mut w: W, rules: &[Rule]) -> anyhow::Result<()> {
-    // TODO(wathiede): if reindexing this many tags is too slow, see if combining rules per tag is
-    // faster.
-    let mut lines = Vec::new();
-    for r in rules {
-        for m in &r.matches {
-            for t in &r.tags {
-                if let MatchType::Unknown = m.match_type {
-                    eprintln!("rule has unknown match {:?}", r);
-                    continue;
-                }
-                let rule = match m.match_type {
-                    MatchType::From => "from:",
-                    // TODO(wathiede): something more specific?
-                    MatchType::Sender => "from:",
-                    MatchType::To => "to:",
-                    MatchType::Cc => "to:",
-                    MatchType::Subject => "subject:",
-                    MatchType::List => "List-ID:",
-                    MatchType::Body => "",
-                    // TODO(wathiede): these will probably require adding fields to notmuch
-                    // index. Handle them later.
-                    MatchType::DeliveredTo
-                    | MatchType::XForwardedTo
-                    | MatchType::ReplyTo
-                    | MatchType::XOriginalTo
-                    | MatchType::XSpam => continue,
-                    MatchType::Unknown => unreachable!(),
-                };
-                // Preserve unread status if run with --remove-all
-                lines.push(format!(
-                    r#"-unprocessed +{} +unread -- is:unread tag:unprocessed {}"{}""#,
-                    t, rule, m.needle
-                ));
-                lines.push(format!(
-                    // TODO(wathiede): this assumes `notmuch new` is configured to add
-                    // `tag:unprocessed` to all new mail.
-                    r#"-unprocessed +{} -- tag:unprocessed {}"{}""#,
-                    t, rule, m.needle
-                ));
-            }
-        }
-    }
-    lines.sort();
-    for l in lines {
-        writeln!(w, "{l}")?;
-    }
-    Ok(())
-}
-fn main() -> anyhow::Result<()> {
-    let input = "/home/wathiede/dotfiles/procmailrc";
+use std::{collections::HashMap, io::Write};
+use clap::{Parser, Subcommand};
+use letterbox_shared::{cleanup_match, Match, MatchType, Rule};
+use sqlx::{types::Json, PgPool};
+#[derive(Debug, Subcommand)]
+enum Mode {
+    Debug,
+    Notmuchrc,
+    LoadSql {
+        #[arg(short, long)]
+        dsn: String,
+    },
+}
+/// Simple program to greet a person
+#[derive(Parser, Debug)]
+#[command(version, about, long_about = None)]
+struct Args {
+    #[arg(short, long, default_value = "/home/wathiede/dotfiles/procmailrc")]
+    input: String,
+    #[command(subcommand)]
+    mode: Mode,
+}
+#[tokio::main]
+async fn main() -> anyhow::Result<()> {
+    let args = Args::parse();
     let mut rules = Vec::new();
     let mut cur_rule = Rule::default();
-    for l in std::fs::read_to_string(input)?.lines() {
+    for l in std::fs::read_to_string(args.input)?.lines() {
         let l = if let Some(idx) = l.find('#') {
             &l[..idx]
         } else {
@@ -222,6 +48,9 @@ fn main() -> anyhow::Result<()> {
         match first {
             ':' => {
                 // start of rule
+                // If carbon-copy flag present, don't stop on match
+                cur_rule.stop_on_match = !l.contains('c');
             }
             '*' => {
                 // add to current rule
@@ -230,26 +59,119 @@ fn main() -> anyhow::Result<()> {
             }
             '.' => {
                 // delivery to folder
-                cur_rule.tags.push(cleanup_match(
+                cur_rule.tag = cleanup_match(
                     "",
                     &l.replace('.', "/")
                         .replace(' ', "")
                        .trim_matches('/')
                         .to_string(),
-                ));
+                );
                 rules.push(cur_rule);
                 cur_rule = Rule::default();
             }
+            '/' => cur_rule = Rule::default(), // Ex. /dev/null
             '|' => cur_rule = Rule::default(), // external command
             '$' => {
                 // TODO(wathiede): tag messages with no other tag as 'inbox'
-                cur_rule.tags.push(cleanup_match("", "inbox"));
+                cur_rule.tag = cleanup_match("", "inbox");
                 rules.push(cur_rule);
                 cur_rule = Rule::default();
             } // variable, should only be $DEFAULT in my config
-            _ => panic!("Unhandled first character '{}' {}", first, l),
+            _ => panic!("Unhandled first character '{}'\nLine: {}", first, l),
         }
     }
-    notmuch_from_rules(std::io::stdout(), &rules)?;
+    match args.mode {
+        Mode::Debug => print_rules(&rules),
+        Mode::Notmuchrc => notmuch_from_rules(std::io::stdout(), &rules)?,
+        Mode::LoadSql { dsn } => load_sql(&dsn, &rules).await?,
+    }
+    Ok(())
+}
+fn print_rules(rules: &[Rule]) {
+    let mut tally = HashMap::new();
+    for r in rules {
+        for m in &r.matches {
+            *tally.entry(m.match_type).or_insert(0) += 1;
+        }
+    }
+    let mut sorted: Vec<_> = tally.iter().map(|(k, v)| (v, k)).collect();
+    sorted.sort();
+    sorted.reverse();
+    for (v, k) in sorted {
+        println!("{k:?}: {v}");
+    }
+}
+fn notmuch_from_rules<W: Write>(mut w: W, rules: &[Rule]) -> anyhow::Result<()> {
+    // TODO(wathiede): if reindexing this many tags is too slow, see if combining rules per tag is
+    // faster.
+    let mut lines = Vec::new();
+    for r in rules {
+        for m in &r.matches {
+            let t = &r.tag;
+            if let MatchType::Unknown = m.match_type {
+                eprintln!("rule has unknown match {:?}", r);
+                continue;
+            }
+            let rule = match m.match_type {
+                MatchType::From => "from:",
+                // TODO(wathiede): something more specific?
+                MatchType::Sender => "from:",
+                MatchType::To => "to:",
+                MatchType::Cc => "to:",
+                MatchType::Subject => "subject:",
+                MatchType::ListId => "List-ID:",
+                MatchType::Body => "",
+                // TODO(wathiede): these will probably require adding fields to notmuch
+                // index. Handle them later.
+                MatchType::DeliveredTo
+                | MatchType::XForwardedTo
+                | MatchType::ReplyTo
+                | MatchType::XOriginalTo
+                | MatchType::XSpam => continue,
+                MatchType::Unknown => unreachable!(),
+            };
+            // Preserve unread status if run with --remove-all
+            lines.push(format!(
+                r#"-unprocessed +{} +unread -- is:unread tag:unprocessed {}"{}""#,
+                t, rule, m.needle
+            ));
+            lines.push(format!(
+                // TODO(wathiede): this assumes `notmuch new` is configured to add
+                // `tag:unprocessed` to all new mail.
+                r#"-unprocessed +{} -- tag:unprocessed {}"{}""#,
+                t, rule, m.needle
+            ));
+        }
+    }
+    lines.sort();
+    for l in lines {
+        writeln!(w, "{l}")?;
+    }
+    Ok(())
+}
+async fn load_sql(dsn: &str, rules: &[Rule]) -> anyhow::Result<()> {
+    let pool = PgPool::connect(dsn).await?;
+    println!("clearing email_rule table");
+    sqlx::query!("DELETE FROM email_rule")
+        .execute(&pool)
+        .await?;
+    for (order, rule) in rules.iter().enumerate() {
+        println!("inserting {order}: {rule:?}");
+        sqlx::query!(
+            r#"
+            INSERT INTO email_rule (sort_order, rule)
+            VALUES ($1, $2)
+            "#,
+            order as i32,
+            Json(rule) as _
+        )
+        .execute(&pool)
+        .await?;
+    }
+    Ok(())
+}
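For reference, `notmuch_from_rules` above emits post-`notmuch new` tagging lines in notmuch's batch-tagging syntax. A small sketch of that line format (the `tagging_line` helper is hypothetical; the tag, field, and needle are example values):

```rust
/// Format one batch-tagging line the way `notmuch_from_rules` does:
/// `-unprocessed +<tag> -- tag:unprocessed <field>"<needle>"`
fn tagging_line(tag: &str, field: &str, needle: &str) -> String {
    format!(r#"-unprocessed +{tag} -- tag:unprocessed {field}"{needle}""#)
}

fn main() {
    let line = tagging_line("EFTours", "from:", "eftours");
    assert_eq!(
        line,
        r#"-unprocessed +EFTours -- tag:unprocessed from:"eftours""#
    );
    println!("{line}");
}
```

Each line swaps the `unprocessed` marker for the rule's tag on every message still matching `tag:unprocessed`, which is why the generator assumes `notmuch new` adds that tag to all new mail.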


@@ -0,0 +1,20 @@
{
"db_name": "PostgreSQL",
"query": "\n SELECT rule as \"rule: Json<Rule>\"\n FROM email_rule\n ORDER BY sort_order\n ",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "rule: Json<Rule>",
"type_info": "Jsonb"
}
],
"parameters": {
"Left": []
},
"nullable": [
false
]
},
"hash": "6c5b0a96f45f78795732ea428cc01b4eab28b7150aa37387e7439a6b0b58e88c"
}


@@ -12,48 +12,47 @@ version.workspace = true
 # See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
 [dependencies]
-ammonia = "4.0.0"
+ammonia = "4.1.0"
-anyhow = "1.0.79"
+anyhow = "1.0.98"
 async-graphql = { version = "7", features = ["log"] }
-async-graphql-axum = "7.0.15"
+async-graphql-axum = "7.0.16"
-async-trait = "0.1.81"
+async-trait = "0.1.88"
 axum = { version = "0.8.3", features = ["ws"] }
 axum-macros = "0.5.0"
-build-info = "0.0.40"
+build-info = "0.0.41"
 cacher = { version = "0.2.0", registry = "xinu" }
-chrono = "0.4.39"
+chrono = "0.4.40"
-clap = { version = "4.5.36", features = ["derive"] }
+clap = { version = "4.5.37", features = ["derive"] }
-css-inline = "0.14.0"
+css-inline = "0.15.0"
 futures = "0.3.31"
 headers = "0.4.0"
 html-escape = "0.2.13"
-letterbox-notmuch = { version = "0.15.5", path = "../notmuch", registry = "xinu" }
+letterbox-notmuch = { path = "../notmuch", version = "0.17.26", registry = "xinu" }
-letterbox-shared = { version = "0.15.5", path = "../shared", registry = "xinu" }
+letterbox-shared = { path = "../shared", version = "0.17.26", registry = "xinu" }
 linkify = "0.10.0"
-log = "0.4.17"
-lol_html = "2.0.0"
-mailparse = "0.16.0"
+lol_html = "2.3.0"
+mailparse = "0.16.1"
 maplit = "1.0.2"
 memmap = "0.7.0"
 regex = "1.11.1"
-reqwest = { version = "0.12.7", features = ["blocking"] }
+reqwest = { version = "0.12.15", features = ["blocking"] }
-scraper = "0.23.0"
+scraper = "0.23.1"
-serde = { version = "1.0.147", features = ["derive"] }
+serde = { version = "1.0.219", features = ["derive"] }
-serde_json = "1.0.87"
+serde_json = "1.0.140"
-sqlx = { version = "0.8.2", features = ["postgres", "runtime-tokio", "time"] }
+sqlx = { version = "0.8.5", features = ["postgres", "runtime-tokio", "time"] }
-tantivy = { version = "0.24.0", optional = true }
+tantivy = { version = "0.24.1", optional = true }
-thiserror = "2.0.0"
+thiserror = "2.0.12"
-tokio = "1.26.0"
+tokio = "1.44.2"
 tower-http = { version = "0.6.2", features = ["trace"] }
 tracing = "0.1.41"
-url = "2.5.2"
+url = "2.5.4"
 urlencoding = "2.1.3"
 #xtracing = { git = "http://git-private.h.xinu.tv/wathiede/xtracing.git" }
 #xtracing = { path = "../../xtracing" }
-xtracing = { version = "0.3.0", registry = "xinu" }
+xtracing = { version = "0.3.2", registry = "xinu" }
 [build-dependencies]
-build-info-build = "0.0.40"
+build-info-build = "0.0.41"
 [features]
 #default = [ "tantivy" ]


@@ -0,0 +1,3 @@
DROP TABLE IF EXISTS email_rule;
-- Add down migration script here


@@ -0,0 +1,5 @@
CREATE TABLE IF NOT EXISTS email_rule (
id integer NOT NULL GENERATED ALWAYS AS IDENTITY,
sort_order integer NOT NULL,
rule jsonb NOT NULL
);


@@ -0,0 +1,2 @@
-- Add down migration script here
ALTER TABLE feed DROP COLUMN IF EXISTS disabled;


@@ -0,0 +1,2 @@
-- Add up migration script here
ALTER TABLE feed ADD disabled boolean;


@@ -21,7 +21,7 @@ use letterbox_notmuch::Notmuch;
 use letterbox_server::tantivy::TantivyConnection;
 use letterbox_server::{
     graphql::{compute_catchup_ids, Attachment, MutationRoot, QueryRoot, SubscriptionRoot},
-    nm::{attachment_bytes, cid_attachment_bytes},
+    nm::{attachment_bytes, cid_attachment_bytes, label_unprocessed},
     ws::ConnectionTracker,
 };
 use letterbox_shared::WebsocketMessage;
@@ -29,9 +29,9 @@ use serde::Deserialize;
 use sqlx::postgres::PgPool;
 use tokio::{net::TcpListener, sync::Mutex};
 use tower_http::trace::{DefaultMakeSpan, TraceLayer};
-use tracing::info;
+use tracing::{error, info};
-// Make our own error that wraps `anyhow::Error`.
+// Make our own error that wraps `ServerError`.
 struct AppError(letterbox_server::ServerError);
 // Tell axum how to convert `AppError` into a response.
@@ -142,6 +142,17 @@
     Ok(inline_attachment_response(attachment))
 }
+// TODO make this work with gitea message ids like `wathiede/letterbox/pulls/91@git.z.xinu.tv`
+async fn view_original(
+    State(AppState { nm, .. }): State<AppState>,
+    extract::Path(id): extract::Path<String>,
+) -> Result<impl IntoResponse, AppError> {
+    info!("view_original {id}");
+    let bytes = nm.show_original(&id)?;
+    let s = String::from_utf8_lossy(&bytes).to_string();
+    Ok(s.into_response())
+}
 async fn graphiql() -> impl IntoResponse {
     response::Html(
         GraphiQLSource::build()
@@ -158,17 +169,22 @@
         connection_tracker, ..
     }): State<AppState>,
 ) -> impl IntoResponse {
+    info!("intiating websocket connection for {addr}");
     ws.on_upgrade(async move |socket| connection_tracker.lock().await.add_peer(socket, addr).await)
 }
 #[derive(Debug, Deserialize)]
 struct NotificationParams {
     delay_ms: Option<u64>,
+    num_unprocessed: Option<usize>,
 }
 async fn send_refresh_websocket_handler(
     State(AppState {
-        connection_tracker, ..
+        nm,
+        pool,
+        connection_tracker,
+        ..
     }): State<AppState>,
     params: Query<NotificationParams>,
 ) -> impl IntoResponse {
@@ -178,12 +194,27 @@
         info!("sleeping {delay:?}");
         tokio::time::sleep(delay).await;
     }
+    let limit = match params.num_unprocessed {
+        Some(0) => None,
+        Some(limit) => Some(limit),
+        None => Some(10),
+    };
+    let mut ids = None;
+    match label_unprocessed(&nm, &pool, false, limit, "tag:unprocessed").await {
+        Ok(i) => ids = Some(i),
+        Err(err) => error!("Failed to label_unprocessed: {err:?}"),
+    };
     connection_tracker
         .lock()
         .await
         .send_message_all(WebsocketMessage::RefreshMessages)
         .await;
-    "refresh triggered"
+    if let Some(ids) = ids {
+        format!("{ids:?}")
+    } else {
+        "refresh triggered".to_string()
+    }
 }
 async fn watch_new(
@@ -192,18 +223,33 @@
     conn_tracker: Arc<Mutex<ConnectionTracker>>,
     poll_time: Duration,
 ) -> Result<(), async_graphql::Error> {
-    let mut old_ids = Vec::new();
-    loop {
+    async fn watch_new_iteration(
+        nm: &Notmuch,
+        pool: &PgPool,
+        conn_tracker: Arc<Mutex<ConnectionTracker>>,
+        old_ids: &[String],
+    ) -> Result<Vec<String>, async_graphql::Error> {
         let ids = compute_catchup_ids(&nm, &pool, "is:unread").await?;
+        info!("old_ids: {} ids: {}", old_ids.len(), ids.len());
         if old_ids != ids {
-            info!("old_ids: {old_ids:?}\n ids: {ids:?}");
+            label_unprocessed(&nm, &pool, false, Some(100), "tag:unprocessed").await?;
             conn_tracker
                 .lock()
                 .await
                 .send_message_all(WebsocketMessage::RefreshMessages)
                 .await
         }
-        old_ids = ids;
+        Ok(ids)
+    }
+    let mut old_ids = Vec::new();
+    loop {
+        old_ids = match watch_new_iteration(&nm, &pool, conn_tracker.clone(), &old_ids).await {
+            Ok(old_ids) => old_ids,
+            Err(err) => {
+                error!("watch_new_iteration failed: {err:?}");
+                continue;
+            }
+        };
         tokio::time::sleep(poll_time).await;
     }
 }
@@ -211,6 +257,7 @@
 #[derive(Clone)]
 struct AppState {
     nm: Notmuch,
+    pool: PgPool,
     connection_tracker: Arc<Mutex<ConnectionTracker>>,
 }
@@ -251,7 +298,7 @@ async fn main() -> Result<(), Box<dyn Error>> {
     let connection_tracker = Arc::new(Mutex::new(ConnectionTracker::default()));
     let ct = Arc::clone(&connection_tracker);
     let poll_time = Duration::from_secs(60);
-    let _h = tokio::spawn(watch_new(nm.clone(), pool, ct, poll_time));
+    let _h = tokio::spawn(watch_new(nm.clone(), pool.clone(), ct, poll_time));
     let api_routes = Router::new()
         .route(
@@ -259,6 +306,7 @@ async fn main() -> Result<(), Box<dyn Error>> {
             get(download_attachment),
         )
         .route("/view/attachment/{id}/{idx}/{*rest}", get(view_attachment))
+        .route("/original/{id}", get(view_original))
         .route("/cid/{id}/{cid}", get(view_cid))
         .route("/ws", any(start_ws))
         .route_service("/graphql/ws", GraphQLSubscription::new(schema.clone()))
@@ -275,6 +323,7 @@ async fn main() -> Result<(), Box<dyn Error>> {
         .nest("/notification", notification_routes)
         .with_state(AppState {
             nm,
+            pool,
             connection_tracker,
         })
         .layer(
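The refresh handler's `num_unprocessed` query parameter treats 0 as a sentinel for "no limit" and defaults to 10 when absent. A standalone sketch of that mapping (the `effective_limit` helper is hypothetical, extracted here only for illustration):

```rust
/// `num_unprocessed` semantics from the refresh handler:
/// absent -> default cap of 10, 0 -> unlimited, anything else -> that cap.
fn effective_limit(num_unprocessed: Option<usize>) -> Option<usize> {
    match num_unprocessed {
        Some(0) => None,            // 0 means "process everything"
        Some(limit) => Some(limit), // explicit cap
        None => Some(10),           // default cap
    }
}

fn main() {
    assert_eq!(effective_limit(None), Some(10));
    assert_eq!(effective_limit(Some(0)), None);
    assert_eq!(effective_limit(Some(25)), Some(25));
    println!("ok");
}
```

Using `Option<usize>` downstream (rather than a magic 0) keeps the "unlimited" case explicit at the call site of `label_unprocessed`.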


@@ -0,0 +1,39 @@
use std::error::Error;
use clap::Parser;
use letterbox_notmuch::Notmuch;
use letterbox_server::nm::label_unprocessed;
use sqlx::postgres::PgPool;
use tracing::info;
#[derive(Parser)]
#[command(version, about, long_about = None)]
struct Cli {
#[arg(short, long)]
newsreader_database_url: String,
#[arg(short, long, default_value = "10")]
/// Set to 0 to process all matches
messages_to_process: usize,
#[arg(short, long, default_value = "false")]
execute: bool,
/// Process messages matching this notmuch query
#[arg(short, long, default_value = "tag:unprocessed")]
query: String,
}
#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
let cli = Cli::parse();
let _guard = xtracing::init(env!("CARGO_BIN_NAME"))?;
build_info::build_info!(fn bi);
info!("Build Info: {}", letterbox_shared::build_version(bi));
let pool = PgPool::connect(&cli.newsreader_database_url).await?;
let nm = Notmuch::default();
let limit = if cli.messages_to_process > 0 {
Some(cli.messages_to_process)
} else {
None
};
label_unprocessed(&nm, &pool, !cli.execute, limit, &cli.query).await?;
Ok(())
}


@@ -9,15 +9,14 @@ use async_graphql::{
 use cacher::FilesystemCacher;
 use futures::stream;
 use letterbox_notmuch::Notmuch;
-use log::info;
 use serde::{Deserialize, Serialize};
 use sqlx::postgres::PgPool;
 use tokio::join;
-use tracing::instrument;
+use tracing::{info, instrument};
 #[cfg(feature = "tantivy")]
 use crate::tantivy::TantivyConnection;
-use crate::{newsreader, nm, Query};
+use crate::{newsreader, nm, nm::label_unprocessed, Query};
 /// # Number of seconds since the Epoch
 pub type UnixTime = isize;
@@ -629,6 +628,10 @@ impl MutationRoot {
         let pool = ctx.data_unchecked::<PgPool>();
         info!("{}", String::from_utf8_lossy(&nm.new()?));
         newsreader::refresh(pool, cacher).await?;
+        // Process email labels
+        label_unprocessed(&nm, &pool, false, Some(10), "tag:unprocessed").await?;
         #[cfg(feature = "tantivy")]
         {
             let tantivy = ctx.data_unchecked::<TantivyConnection>();

@@ -21,7 +21,6 @@ use cacher::{Cacher, FilesystemCacher};
 use css_inline::{CSSInliner, InlineError, InlineOptions};
 pub use error::ServerError;
 use linkify::{LinkFinder, LinkKind};
-use log::{debug, error, info, warn};
 use lol_html::{
     element, errors::RewritingError, html_content::ContentType, rewrite_str, text,
     RewriteStrSettings,
@@ -32,6 +31,7 @@ use reqwest::StatusCode;
 use scraper::{Html, Selector};
 use sqlx::types::time::PrimitiveDateTime;
 use thiserror::Error;
+use tracing::{debug, error, info, warn};
 use url::Url;
 use crate::{


@@ -3,11 +3,10 @@ use std::collections::HashMap;
 use cacher::FilesystemCacher;
 use futures::{stream::FuturesUnordered, StreamExt};
 use letterbox_shared::compute_color;
-use log::{error, info};
 use maplit::hashmap;
 use scraper::Selector;
 use sqlx::postgres::PgPool;
-use tracing::instrument;
+use tracing::{error, info, instrument};
 use url::Url;
 use crate::{
@@ -353,6 +352,9 @@ fn slurp_contents_selectors() -> HashMap<String, Vec<Selector>> {
     "natwelch.com".to_string() => vec![
         Selector::parse("article div.prose").unwrap(),
     ],
+    "seiya.me".to_string() => vec![
+        Selector::parse("header + div").unwrap(),
+    ],
     "rustacean-station.org".to_string() => vec![
         Selector::parse("article").unwrap(),
     ],


@@ -1,12 +1,14 @@
use std::{collections::HashMap, fs::File}; use std::{
collections::{HashMap, HashSet},
fs::File,
};
use letterbox_notmuch::Notmuch; use letterbox_notmuch::Notmuch;
use letterbox_shared::compute_color; use letterbox_shared::{compute_color, Rule};
use log::{error, info, warn};
use mailparse::{parse_content_type, parse_mail, MailHeader, MailHeaderMap, ParsedMail}; use mailparse::{parse_content_type, parse_mail, MailHeader, MailHeaderMap, ParsedMail};
use memmap::MmapOptions; use memmap::MmapOptions;
use sqlx::PgPool; use sqlx::{types::Json, PgPool};
use tracing::instrument; use tracing::{error, info, info_span, instrument, warn};
use crate::{ use crate::{
compute_offset_limit, compute_offset_limit,
@@ -925,3 +927,179 @@ WHERE
.await?; .await?;
Ok(row.map(|r| r.url)) Ok(row.map(|r| r.url))
} }
/*
* grab email_rules table from sql
* For each message with `unprocessed` label
* parse the message
* pass headers for each message through a matcher using email rules
* for each match, add label to message
* if any matches were found, remove unprocessed
* TODO: how to handle inbox label
*/
#[instrument(name="nm::label_unprocessed", skip_all, fields(dryrun=dryrun, limit=?limit, query=%query))]
pub async fn label_unprocessed(
nm: &Notmuch,
pool: &PgPool,
dryrun: bool,
limit: Option<usize>,
query: &str,
) -> Result<Box<[String]>, ServerError> {
use futures::StreamExt;
let ids = nm.message_ids(query)?;
info!(
"Processing {limit:?} of {} messages with '{query}'",
ids.len()
);
let rules: Vec<_> = sqlx::query!(
r#"
SELECT rule as "rule: Json<Rule>"
FROM email_rule
ORDER BY sort_order
"#,
)
.fetch(pool)
.map(|r| r.unwrap().rule.0)
.collect()
.await;
/*
use letterbox_shared::{Match, MatchType};
let rules = vec![Rule {
stop_on_match: false,
matches: vec![Match {
match_type: MatchType::From,
needle: "eftours".to_string(),
}],
tag: "EFTours".to_string(),
}];
*/
info!("Loaded {} rules", rules.len());
let limit = limit.unwrap_or(ids.len());
let limit = limit.min(ids.len());
let ids = &ids[..limit];
let mut add_mutations = HashMap::new();
let mut rm_mutations = HashMap::new();
for id in ids {
let id = format!("id:{id}");
let files = nm.files(&id)?;
// Only process the first file path if multiple files have the same id
let Some(path) = files.iter().next() else {
error!("No files for message-ID {id}");
let t = "Letterbox/Bad";
nm.tag_add(t, &id)?;
let t = "unprocessed";
nm.tag_remove(t, &id)?;
continue;
};
let file = File::open(&path)?;
info!("parsing {path}");
let mmap = unsafe { MmapOptions::new().map(&file)? };
let m = match info_span!("parse_mail", path = path).in_scope(|| parse_mail(&mmap)) {
Ok(m) => m,
Err(err) => {
error!("Failed to parse {path}: {err}");
let t = "Letterbox/Bad";
nm.tag_add(t, &id)?;
let t = "unprocessed";
nm.tag_remove(t, &id)?;
continue;
}
};
let (matched_rule, add_tags) = find_tags(&rules, &m.headers);
if matched_rule {
if dryrun {
info!(
"\nAdd tags: {add_tags:?}\nTo: {} From: {} Subject: {}\n",
m.headers.get_first_value("to").expect("no to header"),
m.headers.get_first_value("from").expect("no from header"),
m.headers
.get_first_value("subject")
.expect("no subject header")
);
}
for t in &add_tags {
//nm.tag_add(t, &id)?;
add_mutations
.entry(t.to_string())
.or_insert_with(|| Vec::new())
.push(id.clone());
}
if add_tags.contains("spam") || add_tags.contains("Spam") {
//nm.tag_remove("unread", &id)?;
let t = "unread".to_string();
rm_mutations
.entry(t)
.or_insert_with(|| Vec::new())
.push(id.clone());
}
if !add_tags.contains("inbox") {
//nm.tag_remove("inbox", &id)?;
let t = "inbox".to_string();
rm_mutations
.entry(t)
.or_insert_with(|| Vec::new())
.push(id.clone());
}
//nm.tag_remove("unprocessed", &id)?;
} else {
if add_tags.is_empty() {
let t = "Grey".to_string();
add_mutations
.entry(t)
.or_insert_with(|| Vec::new())
.push(id.clone());
}
//nm.tag_remove("inbox", &id)?;
let t = "inbox".to_string();
rm_mutations
.entry(t)
.or_insert_with(|| Vec::new())
.push(id.clone());
}
let t = "unprocessed".to_string();
rm_mutations
.entry(t)
.or_insert_with(|| Vec::new())
.push(id.clone());
}
info!("Adding {} distinct labels", add_mutations.len());
for (tag, ids) in add_mutations.iter() {
info!(" {tag}: {}", ids.len());
if !dryrun {
let ids: Vec<_> = ids.iter().map(|s| s.as_str()).collect();
info_span!("tags_add", tag = tag, count = ids.len())
.in_scope(|| nm.tags_add(tag, &ids))?;
}
}
info!("Removing {} distinct labels", rm_mutations.len());
for (tag, ids) in rm_mutations.iter() {
info!(" {tag}: {}", ids.len());
if !dryrun {
let ids: Vec<_> = ids.iter().map(|s| s.as_str()).collect();
info_span!("tags_remove", tag = tag, count = ids.len())
.in_scope(|| nm.tags_remove(tag, &ids))?;
}
}
Ok(ids.into())
}
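The mutation maps above batch tag changes per tag rather than calling notmuch per message. A standalone sketch of that batching (simplified types, not the letterbox code itself):

```rust
// Accumulate (tag, message-id) mutations into one Vec of ids per distinct
// tag, so the backend is invoked once per tag instead of once per message.
use std::collections::HashMap;

fn batch_mutations<'a>(mutations: &[(&'a str, &'a str)]) -> HashMap<&'a str, Vec<&'a str>> {
    let mut out: HashMap<&str, Vec<&str>> = HashMap::new();
    for &(tag, id) in mutations {
        out.entry(tag).or_default().push(id);
    }
    out
}
```

With N messages and T distinct tags this turns O(N) tagging calls into O(T), which is why the function above defers all `tag_add`/`tag_remove` work until after the loop.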
fn find_tags<'a, 'b>(rules: &'a [Rule], headers: &'b [MailHeader]) -> (bool, HashSet<&'a str>) {
let mut matched_rule = false;
let mut add_tags = HashSet::new();
for rule in rules {
for hdr in headers {
if rule.is_match(&hdr.get_key(), &hdr.get_value()) {
//info!("Matched {rule:?}");
matched_rule = true;
add_tags.insert(rule.tag.as_str());
if rule.stop_on_match {
return (true, add_tags);
}
}
}
}
return (matched_rule, add_tags);
}
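The scan above can be illustrated with a cut-down matcher. This sketch assumes simplified `Rule`/header types (one exact header key and a substring needle per rule, unlike the real regex-backed `Rule` in letterbox-shared), but shows the same contract: every matching rule contributes its tag, and `stop_on_match` short-circuits the scan.

```rust
// Simplified rule matcher: case-insensitive header-key match plus
// case-insensitive substring match on the header value.
use std::collections::HashSet;

struct Rule {
    stop_on_match: bool,
    header_key: String,
    needle: String,
    tag: String,
}

impl Rule {
    fn is_match(&self, key: &str, value: &str) -> bool {
        key.eq_ignore_ascii_case(&self.header_key)
            && value.to_lowercase().contains(&self.needle.to_lowercase())
    }
}

fn find_tags<'a>(rules: &'a [Rule], headers: &[(String, String)]) -> (bool, HashSet<&'a str>) {
    let mut matched = false;
    let mut tags = HashSet::new();
    for rule in rules {
        for (k, v) in headers {
            if rule.is_match(k, v) {
                matched = true;
                tags.insert(rule.tag.as_str());
                if rule.stop_on_match {
                    // First match of a stop_on_match rule ends the scan.
                    return (true, tags);
                }
            }
        }
    }
    (matched, tags)
}
```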

View File

@@ -12,6 +12,9 @@ version.workspace = true
[dependencies]
build-info = "0.0.40"
-letterbox-notmuch = { version = "0.15.5", path = "../notmuch", registry = "xinu" }
+letterbox-notmuch = { path = "../notmuch", version = "0.17.26", registry = "xinu" }
-serde = { version = "1.0.147", features = ["derive"] }
+regex = "1.11.1"
+serde = { version = "1.0.219", features = ["derive"] }
+sqlx = "0.8.5"
strum_macros = "0.27.1"
+tracing = "0.1.41"

View File

@@ -1,8 +1,14 @@
-use std::hash::{DefaultHasher, Hash, Hasher};
+use std::{
+    convert::Infallible,
+    hash::{DefaultHasher, Hash, Hasher},
+    str::FromStr,
+};
use build_info::{BuildInfo, VersionControl};
use letterbox_notmuch::SearchSummary;
+use regex::{RegexBuilder, RegexSetBuilder};
use serde::{Deserialize, Serialize};
+use tracing::debug;
#[derive(Serialize, Deserialize, Debug)]
pub struct SearchResult {
@@ -20,6 +26,13 @@ pub enum WebsocketMessage {
pub mod urls {
    pub const MOUNT_POINT: &'static str = "/api";
+    pub fn view_original(host: Option<&str>, id: &str) -> String {
+        if let Some(host) = host {
+            format!("//{host}/api/original/{id}")
+        } else {
+            format!("/api/original/{id}")
+        }
+    }
    pub fn cid_prefix(host: Option<&str>, cid: &str) -> String {
        if let Some(host) = host {
            format!("//{host}/api/cid/{cid}/")
@@ -58,3 +71,198 @@ pub fn compute_color(data: &str) -> String {
    data.hash(&mut hasher);
    format!("#{:06x}", hasher.finish() % (1 << 24))
}
#[derive(
Copy, Clone, Debug, Default, PartialEq, Eq, Hash, Ord, PartialOrd, Serialize, Deserialize,
)]
pub enum MatchType {
From,
Sender,
To,
Cc,
Subject,
ListId,
DeliveredTo,
XForwardedTo,
ReplyTo,
XOriginalTo,
XSpam,
Body,
#[default]
Unknown,
}
#[derive(Debug, Default, Serialize, Deserialize)]
pub struct Match {
pub match_type: MatchType,
pub needle: String,
}
#[derive(Debug, Default, Serialize, Deserialize)]
pub struct Rule {
pub stop_on_match: bool,
pub matches: Vec<Match>,
pub tag: String,
}
impl Rule {
pub fn is_match(&self, header_key: &str, header_value: &str) -> bool {
let pats: Vec<_> = self
.matches
.iter()
.filter_map(|m| match m.match_type {
MatchType::To => Some("^(to|cc|bcc|x-original-to)$"),
MatchType::From => Some("^from$"),
MatchType::Sender => Some("^sender$"),
MatchType::Subject => Some("^subject$"),
MatchType::ListId => Some("^list-id$"),
MatchType::XOriginalTo => Some("^x-original-to$"),
MatchType::ReplyTo => Some("^reply-to$"),
MatchType::XSpam => Some("^x-spam$"),
MatchType::Body => None,
c => panic!("TODO handle '{c:?}' match type"),
})
.collect();
let set = RegexSetBuilder::new(&pats)
.case_insensitive(true)
.build()
.expect("failed to compile regex for matches");
let matches: Vec<_> = set.matches(header_key).into_iter().collect();
if !matches.is_empty() {
//info!("matched key '{header_key}' '{header_value}'");
for m_idx in matches {
let needle = regex::escape(&self.matches[m_idx].needle);
let pat = RegexBuilder::new(&needle)
.case_insensitive(true)
.build()
.expect("failed to compile regex for needle");
if pat.is_match(header_value) {
debug!("{header_key} matched {header_value} against {needle}");
return true;
}
}
}
false
}
}
mod matches {
// From https://linux.die.net/man/5/procmailrc
// If the regular expression contains '^TO_' it will be substituted by '(^((Original-)?(Resent-)?(To|Cc|Bcc)|(X-Envelope |Apparently(-Resent)?)-To):(.*[^-a-zA-Z0-9_.])?)'
// If the regular expression contains '^TO' it will be substituted by '(^((Original-)?(Resent-)?(To|Cc|Bcc)|(X-Envelope |Apparently(-Resent)?)-To):(.*[^a-zA-Z])?)', which should catch all destination specifications containing a specific word.
pub const TO: &'static str = "TO";
pub const CC: &'static str = "Cc";
pub const TOCC: &'static str = "(TO|Cc)";
pub const FROM: &'static str = "From";
pub const SENDER: &'static str = "Sender";
pub const SUBJECT: &'static str = "Subject";
pub const DELIVERED_TO: &'static str = "Delivered-To";
pub const X_FORWARDED_TO: &'static str = "X-Forwarded-To";
pub const REPLY_TO: &'static str = "Reply-To";
pub const X_ORIGINAL_TO: &'static str = "X-Original-To";
pub const LIST_ID: &'static str = "List-ID";
pub const X_SPAM: &'static str = "X-Spam";
pub const X_SPAM_FLAG: &'static str = "X-Spam-Flag";
}
impl FromStr for Match {
type Err = Infallible;
fn from_str(s: &str) -> Result<Self, Self::Err> {
// Examples:
// "* 1^0 ^TOsonyrewards.com@xinu.tv"
// "* ^TOsonyrewards.com@xinu.tv"
let mut it = s.split_whitespace().skip(1);
let mut needle = it.next().unwrap();
if needle == "1^0" {
needle = it.next().unwrap();
}
let mut needle = vec![needle];
needle.extend(it);
let needle = needle.join(" ");
let first = needle.chars().nth(0).unwrap_or(' ');
use matches::*;
if first == '^' {
let needle = &needle[1..];
if needle.starts_with(TO) {
return Ok(Match {
match_type: MatchType::To,
needle: cleanup_match(TO, needle),
});
} else if needle.starts_with(FROM) {
return Ok(Match {
match_type: MatchType::From,
needle: cleanup_match(FROM, needle),
});
} else if needle.starts_with(CC) {
return Ok(Match {
match_type: MatchType::Cc,
needle: cleanup_match(CC, needle),
});
} else if needle.starts_with(TOCC) {
return Ok(Match {
match_type: MatchType::To,
needle: cleanup_match(TOCC, needle),
});
} else if needle.starts_with(SENDER) {
return Ok(Match {
match_type: MatchType::Sender,
needle: cleanup_match(SENDER, needle),
});
} else if needle.starts_with(SUBJECT) {
return Ok(Match {
match_type: MatchType::Subject,
needle: cleanup_match(SUBJECT, needle),
});
} else if needle.starts_with(X_ORIGINAL_TO) {
return Ok(Match {
match_type: MatchType::XOriginalTo,
needle: cleanup_match(X_ORIGINAL_TO, needle),
});
} else if needle.starts_with(LIST_ID) {
return Ok(Match {
match_type: MatchType::ListId,
needle: cleanup_match(LIST_ID, needle),
});
} else if needle.starts_with(REPLY_TO) {
return Ok(Match {
match_type: MatchType::ReplyTo,
needle: cleanup_match(REPLY_TO, needle),
});
} else if needle.starts_with(X_SPAM_FLAG) {
return Ok(Match {
match_type: MatchType::XSpam,
needle: '*'.to_string(),
});
} else if needle.starts_with(X_SPAM) {
return Ok(Match {
match_type: MatchType::XSpam,
needle: '*'.to_string(),
});
} else if needle.starts_with(DELIVERED_TO) {
return Ok(Match {
match_type: MatchType::DeliveredTo,
needle: cleanup_match(DELIVERED_TO, needle),
});
} else if needle.starts_with(X_FORWARDED_TO) {
return Ok(Match {
match_type: MatchType::XForwardedTo,
needle: cleanup_match(X_FORWARDED_TO, needle),
});
} else {
unreachable!("needle: '{needle}'")
}
} else {
return Ok(Match {
match_type: MatchType::Body,
needle: cleanup_match("", &needle),
});
}
}
}
fn unescape(s: &str) -> String {
s.replace('\\', "")
}
pub fn cleanup_match(prefix: &str, s: &str) -> String {
unescape(&s[prefix.len()..]).replace(".*", "")
}

View File

@@ -12,30 +12,29 @@ version.workspace = true
build-info-build = "0.0.40"
[dev-dependencies]
-wasm-bindgen-test = "0.3.33"
+wasm-bindgen-test = "0.3.50"
[dependencies]
console_error_panic_hook = "0.1.7"
-log = "0.4.17"
+log = "0.4.27"
seed = { version = "0.10.0", features = ["routing"] }
#seed = "0.9.2"
-console_log = { version = "0.1.0", registry = "xinu" }
+console_log = { version = "0.1.4", registry = "xinu" }
-serde = { version = "1.0.147", features = ["derive"] }
+serde = { version = "1.0.219", features = ["derive"] }
itertools = "0.14.0"
-serde_json = { version = "1.0.93", features = ["unbounded_depth"] }
+serde_json = { version = "1.0.140", features = ["unbounded_depth"] }
-chrono = "0.4.31"
+chrono = "0.4.40"
graphql_client = "0.14.0"
-thiserror = "2.0.0"
+thiserror = "2.0.12"
gloo-net = { version = "0.6.0", features = ["json", "serde_json"] }
human_format = "1.1.0"
build-info = "0.0.40"
wasm-bindgen = "=0.2.100"
-uuid = { version = "1.13.1", features = [
+uuid = { version = "1.16.0", features = [
    "js",
] } # direct dep to set js feature, prevents Rng issues
-letterbox-shared = { version = "0.15.5", path = "../shared", registry = "xinu" }
+letterbox-shared = { version = "0.17.9", registry = "xinu" }
-letterbox-notmuch = { version = "0.15.5", path = "../notmuch", registry = "xinu" }
-seed_hooks = { version = "0.4.0", registry = "xinu" }
+seed_hooks = { version = "0.4.1", registry = "xinu" }
strum_macros = "0.27.1"
gloo-console = "0.3.0"
[target.'cfg(target_arch = "wasm32")'.dependencies]
@@ -45,14 +44,15 @@ wasm-sockets = "1.0.0"
wasm-opt = ['-Os']
[dependencies.web-sys]
-version = "0.3.58"
+version = "0.3.77"
features = [
    "Clipboard",
    "DomRect",
    "Element",
+   "History",
    "MediaQueryList",
    "Navigator",
-   "Window",
+   "Performance",
-   "History",
    "ScrollRestoration",
+   "Window",
]

View File

@@ -2,8 +2,6 @@
// - it's useful when you want to check your code with `cargo make verify`
// but some rules are too "annoying" or are not applicable for your case.)
#![allow(clippy::wildcard_imports)]
-// Until https://github.com/rust-lang/rust/issues/138762 is addressed in dependencies
-#![allow(wasm_c_abi)]
use log::Level;
use seed::App;

View File

@@ -72,10 +72,6 @@ fn on_url_changed(old: &Url, mut new: Url) -> Msg {
    if did_change {
        messages.push(Msg::ScrollToTop)
    }
-   info!(
-       "url changed\nold '{old}'\nnew '{new}', history {}",
-       history().length().unwrap_or(0)
-   );
    let hpp = new.remaining_hash_path_parts();
    let msg = match hpp.as_slice() {
        ["t", tid] => Msg::ShowThreadRequest {
@@ -553,7 +549,6 @@ pub fn update(msg: Msg, model: &mut Model, orders: &mut impl Orders<Msg>) {
        });
    }
    Msg::ScrollToTop => {
-       info!("scrolling to the top");
        web_sys::window().unwrap().scroll_to_with_x_and_y(0., 0.);
    }
    Msg::WindowScrolled => {
@@ -619,6 +614,36 @@ pub fn update(msg: Msg, model: &mut Model, orders: &mut impl Orders<Msg>) {
        orders.send_msg(Msg::CatchupRequest { query });
    }
    Msg::CatchupKeepUnread => {
if let Some(thread_id) = current_thread_id(&model.context) {
if let Context::ThreadResult {
thread:
ShowThreadQueryThread::EmailThread(ShowThreadQueryThreadOnEmailThread {
messages,
..
}),
..
} = &model.context
{
//orders.send_msg(Msg::SetUnread(thread_id, false));
let unread_messages: Vec<_> = messages
.iter()
.filter(|msg| msg.tags.iter().any(|t| t == "unread"))
.map(|msg| &msg.id)
.collect();
if unread_messages.is_empty() {
// All messages are read, so mark them all unread
orders.send_msg(Msg::SetUnread(thread_id, true));
} else {
// Do nothing if there are some messages unread
}
} else {
// News post, not email, just mark unread
orders.send_msg(Msg::SetUnread(thread_id, true));
};
} else {
// This shouldn't happen
warn!("no current thread_id");
}
        orders.send_msg(Msg::CatchupNext);
    }
    Msg::CatchupMarkAsRead => {

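The `CatchupKeepUnread` branch above only flips a thread back to unread when every message in it has already been read. That predicate, extracted as a standalone sketch over plain tag lists (the real code inspects GraphQL message structs):

```rust
// A thread should be marked unread again only if no message in it still
// carries the "unread" tag; if any message is unread, nothing changes.
fn should_mark_unread(message_tags: &[Vec<&str>]) -> bool {
    message_tags.iter().all(|tags| !tags.contains(&"unread"))
}
```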
View File

@@ -263,81 +263,108 @@ fn search_results(
} else {
    set_title(query);
}
let rows: Vec<_> = results
    .iter()
    .map(|r| {
        let tid = r.thread.clone();
        let check_tid = r.thread.clone();
        let datetime = human_age(r.timestamp as i64);
        let unread_idx = r.tags.iter().position(|e| e == &"unread");
        let mut tags = r.tags.clone();
        if let Some(idx) = unread_idx {
            tags.remove(idx);
        };
        let is_unread = unread_idx.is_some();
        let mut title_break = None;
        const TITLE_LENGTH_WRAP_LIMIT: usize = 40;
        for w in r.subject.split_whitespace() {
            if w.len() > TITLE_LENGTH_WRAP_LIMIT {
                title_break = Some(C!["break-all", "text-pretty"]);
            }
        }
        div![
            C![
                "flex",
                "flex-nowrap",
                "w-auto",
                "flex-auto",
                "py-4",
                "border-b",
                "border-neutral-800"
            ],
            div![
                C!["flex", "items-center", "mr-4"],
                input![
                    C![&tw_classes::CHECKBOX],
                    attrs! {
                        At::Type=>"checkbox",
                        At::Checked=>selected_threads.contains(&tid).as_at_value(),
                    }
                ],
                ev(Ev::Input, move |e| {
                    if let Some(input) = e
                        .target()
                        .as_ref()
                        .expect("failed to get reference to target")
                        .dyn_ref::<web_sys::HtmlInputElement>()
                    {
                        if input.checked() {
                            Msg::SelectionAddThread(check_tid)
                        } else {
                            Msg::SelectionRemoveThread(check_tid)
                        }
                    } else {
                        Msg::Noop
                    }
                }),
            ],
            a![
                C!["flex-grow"],
                IF!(is_unread => C!["font-bold"]),
                attrs! {
                    At::Href => urls::thread(&tid)
                },
                div![title_break, &r.subject],
                span![C!["text-xs"], pretty_authors(&r.authors)],
                div![
                    C!["flex", "flex-wrap", "justify-between"],
                    span![tags_chiclet(&tags)],
                    span![C!["text-sm"], datetime]
                ]
            ]
        ]
    })
    .collect();
let show_bulk_edit = !selected_threads.is_empty();
let all_selected = (selected_threads.len() == results.len()) && !rows.is_empty();
let content = if rows.is_empty() {
    let caught_up = query.contains("is:unread");
    let read_emoji = ["👻", "👽", "👾", "🤖", "💀"];
    let no_results_emoji = ["🙈", "👀", "🤦", "🤷", "🙅", "🛟", "🍩", "🌑", "💿", "🔍"];
    // Randomly choose an emoji based on which 10-second window we're currently in
    let now = seed::window()
        .performance()
        .map(|p| p.now() as usize / 10_000)
        .unwrap_or(0);
    let (emoji, text) = if caught_up {
        let idx = now % read_emoji.len();
        (read_emoji[idx], "All caught up!")
    } else {
        let idx = now % no_results_emoji.len();
        (no_results_emoji[idx], "No results")
    };
    div![
        C!["text-center"],
        h1![C!["text-9xl"], emoji],
        p![C!["mt-8", "text-3xl", "font-semibold"], text]
    ]
} else {
    div![rows]
};
div![
    C!["flex", "flex-col", "flex-auto", "p-4"],
    search_toolbar(count, pager, show_bulk_edit, all_selected),
    content,
    search_toolbar(count, pager, show_bulk_edit, all_selected),
]
}
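The empty-state view rotates its emoji by bucketing the current time into 10-second windows, so the choice is stable between re-renders within a window. A std-only sketch of that selection (the UI reads `performance.now()`; here the milliseconds are simply passed in):

```rust
// Derive a stable index from which 10-second window now_ms falls in.
// Panics if choices is empty, matching the indexing in the view code.
fn pick_emoji<'a>(choices: &'a [&'a str], now_ms: u64) -> &'a str {
    let window = (now_ms / 10_000) as usize;
    choices[window % choices.len()]
}
```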
@@ -541,13 +568,19 @@ fn search_toolbar(
tw_classes::button(), tw_classes::button(),
IF!(!pager.has_previous_page => attrs!{ At::Disabled=>true }), IF!(!pager.has_previous_page => attrs!{ At::Disabled=>true }),
"<", "<",
IF!(pager.has_previous_page => ev(Ev::Click, |_| Msg::PreviousPage)), IF!(pager.has_previous_page => ev(
Ev::Click, |_| Msg::MultiMsg(vec![
Msg::ScrollToTop,
Msg::PreviousPage]))),
], ],
button![ button![
tw_classes::button(), tw_classes::button(),
IF!(!pager.has_next_page => attrs!{ At::Disabled=>true }), IF!(!pager.has_next_page => attrs!{ At::Disabled=>true }),
">", ">",
IF!(pager.has_next_page => ev(Ev::Click, |_| Msg::NextPage)) IF!(pager.has_next_page => ev(
Ev::Click, |_| Msg::MultiMsg(vec![
Msg::ScrollToTop,
Msg::NextPage])))
] ]
] ]
] ]
@@ -688,6 +721,8 @@ fn render_open_header(msg: &ShowThreadQueryThreadOnEmailThreadMessages) -> Node<
    .collect();
let show_x_original_to = !*to_xinu.borrow() && msg.x_original_to.is_some();
let show_delivered_to = !*to_xinu.borrow() && !show_x_original_to && msg.delivered_to.is_some();
+let host = seed::window().location().host().expect("couldn't get host");
+let href = letterbox_shared::urls::view_original(Some(&host), &msg.id);
div![
    C!["flex", "p-4", "bg-neutral-800"],
    div![avatar],
@@ -769,20 +804,36 @@
C!["text-right"],
msg.timestamp
    .map(|ts| div![C!["text-xs", "text-nowrap"], human_age(ts)]),
div![
    C!["p-2"],
    i![C![
        "mx-4",
        "read-status",
        "far",
        if is_unread {
            "fa-envelope"
        } else {
            "fa-envelope-open"
        },
    ]],
    ev(Ev::Click, move |e| {
        e.stop_propagation();
        Msg::SetUnread(id, !is_unread)
    }),
],
div![
    C!["text-xs"],
    span![a![
        attrs! {
            At::Href=>href,
            At::Target=>"_blank",
        },
        "View original",
        ev(Ev::Click, move |e| {
            e.stop_propagation();
        })
    ]]
]
]
]
}
@@ -925,20 +976,23 @@ fn render_closed_header(msg: &ShowThreadQueryThreadOnEmailThreadMessages) -> Nod
C!["text-right"],
msg.timestamp
    .map(|ts| div![C!["text-xs", "text-nowrap"], human_age(ts)]),
div![
    C!["p-2"],
    i![C![
        "mx-4",
        "read-status",
        "far",
        if is_unread {
            "fa-envelope"
        } else {
            "fa-envelope-open"
        },
    ]],
    ev(Ev::Click, move |e| {
        e.stop_propagation();
        Msg::SetUnread(id, !is_unread)
    })
],
]
]
}
@@ -971,7 +1025,7 @@ fn message_render(msg: &ShowThreadQueryThreadOnEmailThreadMessages, open: bool)
],
IF!(open =>
    div![
-       C!["bg-white", "text-black", "p-4", "min-w-full", "w-0", "overflow-x-auto", from],
+       C!["content", "bg-white", "text-black", "p-4", "min-w-full", "w-0", "overflow-x-auto", from],
        match &msg.body {
            ShowThreadQueryThreadOnEmailThreadMessagesBody::UnhandledContentType(
                ShowThreadQueryThreadOnEmailThreadMessagesBodyOnUnhandledContentType { contents, content_tree },
@@ -1075,7 +1129,6 @@ fn render_attachements(
    ]
}
-// TODO: add cathup_mode:bool and hide elements when true
#[topo::nested]
fn thread(
    thread: &ShowThreadQueryThreadOnEmailThread,
@@ -1166,13 +1219,7 @@ fn thread(
    el_ref(content_el),
    messages,
    IF!(!catchup_mode => click_to_top())
-   ],
-   /* TODO(wathiede): plumb in orignal id
-   a![
-       attrs! {At::Href=>api::original(&thread_node.0.as_ref().expect("message missing").id)},
-       "Original"
-   ],
-   */
+   ]
]
}

View File

@@ -1,7 +1,7 @@
use std::{collections::VecDeque, rc::Rc};
use letterbox_shared::WebsocketMessage;
-use log::{error, info};
+use log::{debug, error};
use seed::prelude::*;
use serde::{Deserialize, Serialize};
#[cfg(not(target_arch = "wasm32"))]
@@ -122,13 +122,13 @@ pub fn update(msg: Msg, model: &mut Model, orders: &mut impl Orders<Msg>) {
match msg {
    Msg::WebSocketOpened => {
        model.web_socket_reconnector = None;
-       info!("WebSocket connection is open now");
+       debug!("WebSocket connection is open now");
    }
    Msg::TextMessageReceived(msg) => {
        model.updates.push_back(msg);
    }
    Msg::WebSocketClosed(close_event) => {
-       info!(
+       debug!(
            r#"==================
WebSocket connection was closed:
Clean: {0}
@@ -148,7 +148,7 @@ Reason: {2}
    }
}
Msg::WebSocketFailed => {
-   info!("WebSocket failed");
+   debug!("WebSocket failed");
    if model.web_socket_reconnector.is_none() {
        model.web_socket_reconnector = Some(
            orders.stream_with_handle(streams::backoff(None, Msg::ReconnectWebSocket)),
@@ -156,7 +156,7 @@ Reason: {2}
    }
}
Msg::ReconnectWebSocket(retries) => {
-   info!("Reconnect attempt: {}", retries);
+   debug!("Reconnect attempt: {}", retries);
    model.web_socket = create_websocket(&model.ws_url, orders).unwrap();
}
Msg::SendMessage(msg) => {
@@ -177,16 +177,16 @@ fn create_websocket(url: &str, orders: &impl Orders<Msg>) -> Result<EventClient,
let send = msg_sender.clone();
client.set_on_connection(Some(Box::new(move |client: &EventClient| {
-   info!("{:#?}", client.status);
+   debug!("{:#?}", client.status);
    let msg = match *client.status.borrow() {
        ConnectionStatus::Connecting => {
-           info!("Connecting...");
+           debug!("Connecting...");
            None
        }
        ConnectionStatus::Connected => Some(Msg::WebSocketOpened),
        ConnectionStatus::Error => Some(Msg::WebSocketFailed),
        ConnectionStatus::Disconnected => {
-           info!("Disconnected");
+           debug!("Disconnected");
            None
        }
    };
@@ -195,7 +195,7 @@
let send = msg_sender.clone();
client.set_on_close(Some(Box::new(move |ev| {
-   info!("WS: Connection closed");
+   debug!("WS: Connection closed");
    send(Some(Msg::WebSocketClosed(ev)));
})));

View File

@@ -2,23 +2,23 @@ html {
    background-color: black;
}
-.mail-thread a,
+.mail-thread .content a,
.news-post a {
    color: var(--color-link) !important;
    text-decoration: underline;
}
-.mail-thread br,
+.mail-thread .content br,
.news-post br {
    display: block;
    margin-top: 1em;
    content: " ";
}
-.mail-thread h1,
-.mail-thread h2,
-.mail-thread h3,
-.mail-thread h4,
+.mail-thread .content h1,
+.mail-thread .content h2,
+.mail-thread .content h3,
+.mail-thread .content h4,
.news-post h1,
.news-post h2,
.news-post h3,
@@ -27,12 +27,12 @@ html {
    margin-bottom: 1em !important;
}
-.mail-thread p,
+.mail-thread .content p,
.news-post p {
    margin-bottom: 1em;
}
-.mail-thread pre,
+.mail-thread .content pre,
.news-post pre {
    font-family: monospace;
    background-color: #eee !important;
@@ -40,28 +40,28 @@ html {
    white-space: break-spaces;
}
-.mail-thread code,
+.mail-thread .content code,
.news-post code {
    font-family: monospace;
    white-space: break-spaces;
    background-color: #eee !important;
}
-.mail-thread blockquote {
+.mail-thread .content blockquote {
    padding-left: 1em;
    border-left: 2px solid #ddd;
}
-.mail-thread ol,
-.mail-thread ul {
+.mail-thread .content ol,
+.mail-thread .content ul {
    margin-left: 2em;
}
-.mail-thread .noreply-news-bloomberg-com a {
+.mail-thread .content .noreply-news-bloomberg-com a {
    background-color: initial !important;
}
-.mail-thread .noreply-news-bloomberg-com h2 {
+.mail-thread .content .noreply-news-bloomberg-com h2 {
    margin: 0 !important;
    padding: 0 !important;
}
@@ -76,6 +76,11 @@ html {
    display: none !important;
}
+.news-post.site-seiya-me figure>pre,
+.news-post.site-seiya-me figure>pre>code {
+    background-color: black !important;
+}
.news-post.site-slashdot .story-byline {
    display: block !important;
    height: initial !important;