Compare commits

...

1099 Commits

Author SHA1 Message Date
89cb1e4e75 chore: Release 2025-09-02 19:27:38 -07:00
46021c5d2c server: remove dupe and move code to more idomatic layout 2025-09-02 19:27:14 -07:00
4243f7b77d chore: Release 2025-09-02 19:25:24 -07:00
06e65a52b3 server: most test to end of file 2025-09-02 19:24:34 -07:00
f3c5b4eb8c server: fix issues from latest renovatebot 2025-09-02 19:23:26 -07:00
184ac3011d fix(deps): update all non-major dependencies 2025-09-02 19:23:26 -07:00
fba27ab7c4 server: add new test file 2025-09-02 19:23:26 -07:00
4526d99de9 server: add another test and tweak fallback extraction logic 2025-09-02 19:23:26 -07:00
6ff9b2cd54 server: address lint 2025-09-02 19:23:26 -07:00
3311f2fc00 server: render text extract calendar info w/ ics template 2025-09-02 19:23:26 -07:00
3fd41062d7 server: work in progress to improve calendar extraction 2025-09-02 19:23:26 -07:00
2f0a3f50b8 server: move tests to the bottom of the file 2025-09-02 19:23:25 -07:00
caf924203e server: fix some recurring parsing/viz 2025-09-02 19:23:25 -07:00
7b7f012b19 server: add new calendar parser test 2025-09-02 19:23:25 -07:00
710e440fbf Merge pull request 'chore(deps): update dependency font-awesome to v7' (#135) from renovate/font-awesome-7.x into master 2025-09-02 19:01:38 -07:00
af48dff922 chore(deps): update dependency font-awesome to v7 2025-09-02 22:00:56 +00:00
cbe7dbed96 Merge pull request 'chore(deps): lock file maintenance' (#150) from renovate/lock-file-maintenance into master 2025-08-21 19:00:55 -07:00
6b011e0ffa chore(deps): lock file maintenance 2025-08-22 00:32:11 +00:00
ab1862db2d chore: Release 2025-08-21 17:15:42 -07:00
0afa6da3f2 server: remove unused icalendar dep 2025-08-21 17:15:23 -07:00
f85649dadd cargo fmt 2025-08-21 17:09:57 -07:00
0140fa5efe chore: Release 2025-08-21 16:47:03 -07:00
832b322b77 web: much more compact read mail headers 2025-08-21 16:45:35 -07:00
66dbcf2cfd server: style tweak for tls_report summary 2025-08-21 11:21:10 -07:00
54dc45660a chore: Release 2025-08-19 17:09:03 -07:00
3827f87111 server: address lint 2025-08-19 17:08:53 -07:00
25839328ac server: style changes for start/end datetime on calendar widget 2025-08-19 17:03:45 -07:00
b2c20cc010 chore: Release 2025-08-19 16:58:12 -07:00
7f1f61dc7d server: cargo fmt 2025-08-19 16:57:44 -07:00
6ca2459034 server: highlight today's date on the calendar widget 2025-08-19 16:57:26 -07:00
ea60cce86b server: extract calendar info and render widget on email w/o ics 2025-08-19 16:49:04 -07:00
b4113cb59a server: fmt html 2025-08-19 16:23:52 -07:00
f0493d165d server: minor style cleanup for calendar rendering 2025-08-19 16:19:56 -07:00
43d856ae7e server: move calendar widget to askama 2025-08-19 16:17:22 -07:00
5b48c5dbc3 server: move calendar rendering to askama template 2025-08-19 13:26:33 -07:00
d16c221995 server: cleanup calendar summary on mobile 2025-08-19 12:41:46 -07:00
00ce9267c1 server: improved calendar widget rendering 2025-08-19 12:04:42 -07:00
8acf541d53 server: remove excess logging 2025-08-19 12:04:29 -07:00
49e93829dd server: include a calendar widget showing the calendar event 2025-08-19 11:22:31 -07:00
a8a5089ed3 server: render calendar summary before any pre-existing text 2025-08-19 11:17:11 -07:00
cc994df4e5 server: only render text/calendar summary table on calendar invites 2025-08-19 11:09:30 -07:00
d143b2715d server: add ics testdata 2025-08-19 09:56:59 -07:00
c2428c073c server: broken parsing of google ics 2025-08-19 09:51:58 -07:00
574de65c35 server: handle application/* as an attachment 2025-08-18 12:11:31 -07:00
834e873862 chore: Release 2025-08-18 10:16:15 -07:00
6c07b18eec server: add envelope_to support to DMARC report 2025-08-18 10:15:17 -07:00
b191bcbddf chore: Release 2025-08-15 14:02:20 -07:00
a1be436209 server: address lint 2025-08-15 14:01:14 -07:00
5b471b278c server: fix tests 2025-08-15 13:58:53 -07:00
34bda32e30 chore: Release 2025-08-13 16:07:44 -07:00
501ee417c9 server: address lint 2025-08-13 16:07:35 -07:00
ecc0a88341 chore: Release 2025-08-13 16:05:02 -07:00
d36d508df0 server: move email extraction code into separate mod 2025-08-13 10:36:50 -07:00
b9b12dd717 chore: Release 2025-08-12 17:04:27 -07:00
633e055472 cargo sqlx prepare 2025-08-12 17:04:25 -07:00
951ee70279 server: don't duplicate dmarc table for google 2025-08-12 17:04:03 -07:00
3a41ab1767 server: much improved xmls pretty printer 2025-08-12 17:04:03 -07:00
5c9955a89e server: fix raw dmarc extraction for non-Google domains 2025-08-12 17:04:03 -07:00
1f75627fd2 server: fix is_dmarc check 2025-08-12 17:04:03 -07:00
5c42d04598 server: pretty print raw TLSRPT and DMARC data 2025-08-12 17:04:03 -07:00
4d888fbea3 server: more TLS report support and minor refactoring 2025-08-12 17:04:03 -07:00
8f53678e53 server: TLS report support 2025-08-12 17:04:03 -07:00
8218fca2ef server: include reason in dmarc report 2025-08-12 17:04:03 -07:00
01164d6afa Merge pull request 'fix(deps): update all non-major dependencies' (#148) from renovate/all-minor-patch into master 2025-08-11 18:15:48 -07:00
2f06ae93ae fix(deps): update all non-major dependencies 2025-08-11 23:32:19 +00:00
75d4fe49e2 chore: Release 2025-08-11 16:20:48 -07:00
9f2016940b Merge pull request 'fix(deps): update all non-major dependencies' (#147) from renovate/all-minor-patch into master 2025-08-11 16:00:57 -07:00
ba9cc0127b fix(deps): update all non-major dependencies 2025-08-11 22:17:25 +00:00
ce17c4a7d8 chore: Release 2025-08-11 14:54:16 -07:00
c8850404b8 server: rework dmarc parsing to use askama 2025-08-11 14:53:12 -07:00
638e94b4ae web: create seperate email overrides CSS file 2025-08-11 12:42:45 -07:00
d0f4716d83 server: add gzip dmarc email support 2025-08-11 12:41:25 -07:00
59e35062e7 server: handle application/zip for google dmarc 2025-08-11 12:41:03 -07:00
43827b4d87 Merge pull request 'fix(deps): update rust crate uuid to v1.18.0' (#145) from renovate/all-minor-patch into master 2025-08-11 03:45:52 -07:00
b29e92cd9c fix(deps): update rust crate uuid to v1.18.0 2025-08-11 10:31:27 +00:00
42bea43de9 Merge pull request 'chore(deps): lock file maintenance' (#144) from renovate/lock-file-maintenance into master 2025-08-10 18:01:35 -07:00
4048edde11 chore(deps): lock file maintenance 2025-08-11 00:01:53 +00:00
90768d0d1b Merge pull request 'fix(deps): update rust crate clap to v4.5.43' (#143) from renovate/all-minor-patch into master 2025-08-06 10:15:44 -07:00
70e6271ca3 fix(deps): update rust crate clap to v4.5.43 2025-08-06 16:46:18 +00:00
0bda21e5e9 Merge pull request 'chore(deps): lock file maintenance' (#142) from renovate/lock-file-maintenance into master 2025-08-03 18:01:42 -07:00
f987b4e4b4 chore(deps): lock file maintenance 2025-08-04 00:01:42 +00:00
a873ec9208 Merge pull request 'fix(deps): update rust crate tokio to v1.47.1' (#141) from renovate/all-minor-patch into master 2025-08-01 05:16:00 -07:00
d8d26e1f59 fix(deps): update rust crate tokio to v1.47.1 2025-08-01 11:46:16 +00:00
1322dde5c5 Merge pull request 'fix(deps): update rust crate serde_json to v1.0.142' (#140) from renovate/all-minor-patch into master 2025-07-31 17:45:58 -07:00
a2147081e8 fix(deps): update rust crate serde_json to v1.0.142 2025-08-01 00:01:45 +00:00
8c6a24e400 Merge pull request 'fix(deps): update rust crate clap to v4.5.42' (#139) from renovate/all-minor-patch into master 2025-07-29 20:30:48 -07:00
8a08d97930 fix(deps): update rust crate clap to v4.5.42 2025-07-30 03:01:20 +00:00
d24a851cd7 Merge pull request 'chore(deps): lock file maintenance' (#138) from renovate/lock-file-maintenance into master 2025-07-27 17:31:34 -07:00
f6ff597f66 chore(deps): lock file maintenance 2025-07-28 00:01:34 +00:00
387d133f09 Merge pull request 'fix(deps): update rust crate css-inline to 0.17.0' (#137) from renovate/all-minor-patch into master 2025-07-26 13:15:52 -07:00
a9674e8b7b fix(deps): update rust crate css-inline to 0.17.0 2025-07-26 20:01:22 +00:00
457f9ac1c2 Merge pull request 'fix(deps): update rust crate tokio to v1.47.0' (#136) from renovate/all-minor-patch into master 2025-07-26 09:31:17 -07:00
d62759565f fix(deps): update rust crate tokio to v1.47.0 2025-07-26 15:32:00 +00:00
4fd97700f7 Merge pull request 'chore(deps): lock file maintenance' (#134) from renovate/lock-file-maintenance into master 2025-07-20 17:31:41 -07:00
99b9a88663 chore(deps): lock file maintenance 2025-07-21 00:01:49 +00:00
56e6036892 Merge pull request 'fix(deps): update rust crate strum_macros to v0.27.2' (#133) from renovate/strum-monorepo into master 2025-07-20 10:45:48 -07:00
232e436378 fix(deps): update rust crate strum_macros to v0.27.2 2025-07-20 16:46:45 +00:00
e2bf4d890f chore: Release 2025-07-20 09:32:44 -07:00
e9584785a8 web: address -D warning error 2025-07-20 09:32:22 -07:00
7a4d2abdd5 Merge branch 'renovate/all-minor-patch' 2025-07-18 16:07:20 -07:00
b764d725b1 fix(deps): update all non-major dependencies 2025-07-18 19:01:31 +00:00
7bac98762c Merge pull request 'chore(deps): lock file maintenance' (#131) from renovate/lock-file-maintenance into master 2025-07-13 17:46:50 -07:00
2bedd92e1a chore(deps): lock file maintenance 2025-07-14 00:01:43 +00:00
da72c09fa3 Merge pull request 'fix(deps): update rust crate clap to v4.5.41' (#130) from renovate/all-minor-patch into master 2025-07-09 16:00:58 -07:00
38c1942ebb fix(deps): update rust crate clap to v4.5.41 2025-07-09 22:46:22 +00:00
05a7386dd1 Merge pull request 'fix(deps): update rust crate ammonia to v4.1.1' (#129) from renovate/all-minor-patch into master 2025-07-08 10:00:50 -07:00
477ffe8d82 fix(deps): update rust crate ammonia to v4.1.1 2025-07-08 16:31:39 +00:00
5d80f32b49 Merge pull request 'chore(deps): lock file maintenance' (#128) from renovate/lock-file-maintenance into master 2025-07-06 17:46:42 -07:00
ae76bdf9a5 chore(deps): lock file maintenance 2025-07-07 00:01:43 +00:00
50e3c77e49 Merge pull request 'fix(deps): update rust crate tokio to v1.46.1' (#127) from renovate/all-minor-patch into master 2025-07-04 14:01:06 -07:00
e85a505775 fix(deps): update rust crate tokio to v1.46.1
All checks were successful
Continuous integration / Check (push) Successful in 1m4s
Continuous integration / Test Suite (push) Successful in 1m52s
Continuous integration / Trunk (push) Successful in 7m18s
Continuous integration / Rustfmt (push) Successful in 45s
Continuous integration / build (push) Successful in 2m25s
Continuous integration / Disallow unused dependencies (push) Successful in 2m8s
2025-07-04 20:16:47 +00:00
86ea5a13f3 fix(deps): update rust crate tokio to v1.46.0
All checks were successful
Continuous integration / Check (push) Successful in 59s
Continuous integration / Test Suite (push) Successful in 1m16s
Continuous integration / Trunk (push) Successful in 59s
Continuous integration / Rustfmt (push) Successful in 50s
Continuous integration / build (push) Successful in 1m23s
Continuous integration / Disallow unused dependencies (push) Successful in 2m21s
2025-07-02 08:31:20 +00:00
a30bff925f fix(deps): update rust crate reqwest to v0.12.22
All checks were successful
Continuous integration / Check (push) Successful in 43s
Continuous integration / Test Suite (push) Successful in 57s
Continuous integration / Trunk (push) Successful in 53s
Continuous integration / Rustfmt (push) Successful in 35s
Continuous integration / build (push) Successful in 1m19s
Continuous integration / Disallow unused dependencies (push) Successful in 2m8s
2025-07-01 18:31:21 +00:00
6fdfbb1ee2 Merge branch 'renovate/all-minor-patch'
All checks were successful
Continuous integration / Check (push) Successful in 51s
Continuous integration / Test Suite (push) Successful in 1m0s
Continuous integration / Trunk (push) Successful in 1m4s
Continuous integration / Rustfmt (push) Successful in 38s
Continuous integration / build (push) Successful in 1m17s
Continuous integration / Disallow unused dependencies (push) Successful in 2m6s
2025-07-01 11:24:26 -07:00
561316ddd4 web: fix letterbox-shared package reference in Cargo.toml 2025-07-01 11:23:41 -07:00
495e495888 fix(deps): update all non-major dependencies 2025-07-01 15:56:35 +00:00
ddb4c812ce chore(deps): lock file maintenance 2025-06-30 00:01:45 +00:00
1aaf914ac5 chore: Release 2025-06-23 13:49:28 -07:00
982b5dae2f server: add disabled column to feed table 2025-06-23 13:41:11 -07:00
8807c1b1f5 chore(deps): lock file maintenance 2025-06-23 19:37:51 +00:00
fa23658ef0 web: remove now obsolete allow directive 2025-06-23 12:32:23 -07:00
f175faed98 fix(deps): update rust crate css-inline to v0.14.5 2025-06-16 21:46:30 +00:00
8971c16117 chore(deps): lock file maintenance 2025-06-16 00:01:44 +00:00
fbecf564b5 fix(deps): update rust crate reqwest to v0.12.20 2025-06-10 19:16:14 +00:00
e5643c6fd0 fix(deps): update rust crate clap to v4.5.40 2025-06-09 18:31:15 +00:00
a8734269f7 chore(deps): lock file maintenance 2025-06-09 00:01:43 +00:00
cab4e571f3 fix(deps): update all non-major dependencies 2025-06-03 13:16:29 +00:00
4d6c6af7d9 fix(deps): update all non-major dependencies 2025-06-02 12:47:12 +00:00
cf08831ed1 chore(deps): lock file maintenance 2025-06-02 03:32:02 +00:00
e1509c5978 fix(deps): update all non-major dependencies 2025-06-01 20:31:35 -07:00
13db8e6f1f chore(deps): lock file maintenance 2025-06-02 02:46:35 +00:00
136a837fa4 chore(deps): lock file maintenance 2025-06-02 00:01:42 +00:00
1ea058c664 fix(deps): update all non-major dependencies 2025-05-28 16:16:24 +00:00
f4c11c5b3f fix(deps): update all non-major dependencies 2025-05-28 13:01:55 +00:00
8dc8f3a0f8 chore(deps): lock file maintenance 2025-05-26 00:01:31 +00:00
7b9450b65b fix(deps): update all non-major dependencies 2025-05-24 14:47:03 +00:00
b5de0719dd fix(deps): update all non-major dependencies 2025-05-24 02:31:52 +00:00
58da28a19b fix(deps): update all non-major dependencies 2025-05-23 23:31:44 +00:00
75ad27ec2f chore: Release 2025-05-23 16:22:27 -07:00
f904fa0001 Add slurp and CSS for seiya-me 2025-05-23 16:21:57 -07:00
b94596bf65 fix(deps): update all non-major dependencies 2025-05-22 15:01:32 +00:00
aa24599921 chore(deps): lock file maintenance 2025-05-19 00:01:49 +00:00
c81a8c1cd3 chore: Release 2025-05-18 09:54:26 -07:00
7c3cfec3d1 web: improve keep unread logic in catchup, remove excess logging 2025-05-18 09:54:03 -07:00
a2920fde3b chore(deps): lock file maintenance 2025-05-12 00:01:38 +00:00
8bc449ae6e fix(deps): update rust crate clap to v4.5.38 2025-05-11 01:16:28 +00:00
0febd0535a fix(deps): update rust crate tower-http to v0.6.4 2025-05-10 20:46:27 +00:00
a9e00a54e4 fix(deps): update rust crate tower-http to v0.6.3 2025-05-07 19:46:07 +00:00
6811c689ff fix(deps): update rust crate tokio to v1.45.0 2025-05-06 06:46:13 +00:00
8ba6b3d0b0 chore(deps): lock file maintenance 2025-05-05 00:01:38 +00:00
a7c5585e80 fix(deps): update rust crate axum to v0.8.4 2025-04-30 16:46:20 +00:00
4ef4d49113 fix(deps): update rust crate chrono to v0.4.41 2025-04-29 09:31:11 +00:00
f8af303110 chore(deps): lock file maintenance 2025-04-28 00:01:40 +00:00
fa5aac34ba chore: Release 2025-04-24 12:03:13 -07:00
b58556254e notmuch: log any stderr output 2025-04-24 12:02:55 -07:00
e365ced7dd server: more concise slice of ids 2025-04-24 12:02:40 -07:00
93d569fb14 chore: Release 2025-04-24 09:04:42 -07:00
f86a5f464d server: properly limit index 2025-04-24 09:04:22 -07:00
956c20b156 chore: Release 2025-04-24 08:56:56 -07:00
1eb498712b server: prevent out of bounds index at end of processing 2025-04-24 08:56:19 -07:00
f12979c0be chore: Release 2025-04-23 18:59:16 -07:00
4665f34e54 server: label_unprocessed handle case where files cannot be found from message-id 2025-04-23 18:57:54 -07:00
bbdc35061c chore: Release 2025-04-23 15:25:34 -07:00
f11f0b4d23 server: migrate all use of log to tracing 2025-04-23 15:25:11 -07:00
c7c47e4a73 chore: Release 2025-04-23 14:57:39 -07:00
c3835522b2 server: add Letterbox/Bad label to unparsable emails, and consider them processed 2025-04-23 14:57:13 -07:00
dfa80f9046 chore: Release 2025-04-23 14:41:25 -07:00
b8dfdabf8d server: more tracing and logging 2025-04-23 14:41:11 -07:00
bbcf52b006 chore: Release 2025-04-23 11:38:48 -07:00
f92c05cd28 server: return ids processed from send_refresh_websocket_handler 2025-04-23 11:38:30 -07:00
885bbe0a8c chore: Release 2025-04-23 11:09:19 -07:00
8b1d111837 chore: Release 2025-04-23 11:02:46 -07:00
08abf31fa9 server: always remove unprocessed label when processing rules 2025-04-23 11:02:29 -07:00
fa99959508 chore: Release 2025-04-23 09:31:43 -07:00
0f6af0f475 server: more debug prints 2025-04-23 09:31:25 -07:00
4c486e9168 chore: Release 2025-04-22 22:43:37 -07:00
109d380ea7 server: remove inbox on no-match 2025-04-22 22:43:22 -07:00
4244fa0d82 chore: Release 2025-04-22 22:41:26 -07:00
4b15e71893 server: remove unprocessed appropriately 2025-04-22 22:41:09 -07:00
1bbebad01b chore: Release 2025-04-22 22:28:20 -07:00
27edffd090 Set version for all packages 2025-04-22 22:28:03 -07:00
08212a9f78 chore: Release 2025-04-22 22:26:17 -07:00
877ec6c4b0 server: drop version requirement 2025-04-22 22:26:03 -07:00
3ce92d6bdf chore: Release 2025-04-22 22:24:37 -07:00
1a28bb2021 Use path for notmuch crate 2025-04-22 22:24:07 -07:00
b86f72f75c chore: Release 2025-04-22 22:20:00 -07:00
1a8b98d420 Use relative import for notmuch 2025-04-22 22:19:45 -07:00
383a7d800f chore: Release 2025-04-22 22:18:50 -07:00
453561140a server: batch tag changes and add default Grey tag 2025-04-22 22:18:24 -07:00
f6d5d3755b chore: Release 2025-04-22 21:24:53 -07:00
5226fe090e server & web: run label_unprocessed before notifying web client 2025-04-22 21:22:50 -07:00
c10ad00ca7 chore: Release 2025-04-22 17:52:04 -07:00
64fc92c3d6 web: refresh including the server side on websocket reconnect 2025-04-22 17:51:53 -07:00
b9c116d5b6 server: mark spam as read 2025-04-22 17:51:53 -07:00
007200b37b fix(deps): update rust crate xtracing to v0.3.2 2025-04-22 23:01:17 +00:00
9824ad1e18 chore(deps): lock file maintenance 2025-04-22 15:16:24 +00:00
a8819c7551 gitea: use nightly when doing trunk build 2025-04-22 08:13:38 -07:00
8cdfbdd08f chore: Release 2025-04-22 07:59:42 -07:00
b2d1dc9276 cargo update && cargo upgrade 2025-04-22 07:59:12 -07:00
1f79b43a85 chore: Release 2025-04-21 22:01:49 -07:00
904619bccd chore: Release 2025-04-21 22:01:41 -07:00
14104f6469 Remove non-hermetic default flag values 2025-04-21 21:59:22 -07:00
dccfb6f71f chore: Release 2025-04-21 21:20:51 -07:00
547266a705 Fix imports for letterbox-* packages 2025-04-21 21:20:31 -07:00
273562b58c chore: Release 2025-04-21 21:16:43 -07:00
dc39eed1a7 cargo sqlx prepare 2025-04-21 21:16:42 -07:00
9178badfd0 Add mail tagging support 2025-04-21 21:15:55 -07:00
38e75ec251 web: make random emoji selection more deterministic 2025-04-21 10:12:12 -07:00
c1496bf87b server: doc cleanup 2025-04-20 10:48:59 -07:00
4da888b240 Move id format check from server into notmuch 2025-04-20 10:47:40 -07:00
c703be2ca5 server: more robust view original serving 2025-04-20 10:01:22 -07:00
5cec8add5e chore: Release
Some checks failed
Continuous integration / Check (push) Successful in 42s
Continuous integration / Trunk (push) Failing after 38s
Continuous integration / Test Suite (push) Successful in 1m20s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / build (push) Successful in 1m20s
Continuous integration / Disallow unused dependencies (push) Successful in 1m0s
2025-04-20 09:46:49 -07:00
0225dbde3a procmail2notmuch: don't run migration code, leave it to server 2025-04-20 09:46:27 -07:00
f84b8fa6c2 chore: Release 2025-04-20 09:38:35 -07:00
979cbcd23e procmail2notmuch: include early exit option 2025-04-20 09:37:51 -07:00
b3070e1919 web: use random emoji when search results empty, handle search vs catchup 2025-04-20 09:37:12 -07:00
e5fdde8f30 web: add graphic when search results are empty 2025-04-20 09:07:43 -07:00
7de36bbc3d procmail2notmuch: add sql rule loader 2025-04-20 08:40:06 -07:00
1c4f27902e server: add todo 2025-04-20 08:39:47 -07:00
7ee86f0d2f chore: Release
Some checks failed
Continuous integration / Check (push) Successful in 36s
Continuous integration / Test Suite (push) Successful in 41s
Continuous integration / Trunk (push) Failing after 35s
Continuous integration / Rustfmt (push) Successful in 38s
Continuous integration / build (push) Successful in 47s
Continuous integration / Disallow unused dependencies (push) Successful in 1m57s
2025-04-19 13:19:14 -07:00
a0b06fd5ef chore: Release 2025-04-19 13:17:01 -07:00
630bb20b35 procmail2notmuch: add debug vs notmuchrc modes 2025-04-19 13:16:47 -07:00
17ea2a35cb web: tweak style and behavior of view original link 2025-04-19 13:11:57 -07:00
7d9376d607 Add view original functionality 2025-04-19 12:33:11 -07:00
122e949072 chore: Release
Some checks failed
Continuous integration / Test Suite (push) Successful in 41s
Continuous integration / Check (push) Successful in 1m33s
Continuous integration / Trunk (push) Failing after 35s
Continuous integration / Rustfmt (push) Successful in 56s
Continuous integration / build (push) Successful in 48s
Continuous integration / Disallow unused dependencies (push) Successful in 3m14s
2025-04-16 08:48:35 -07:00
9a69b4c51e web: scroll to top on pagination 2025-04-16 08:47:45 -07:00
251151244b chore: Release
Some checks failed
Continuous integration / Check (push) Successful in 1m29s
Continuous integration / Test Suite (push) Successful in 41s
Continuous integration / Rustfmt (push) Successful in 30s
Continuous integration / Trunk (push) Failing after 1m9s
Continuous integration / build (push) Successful in 47s
Continuous integration / Disallow unused dependencies (push) Successful in 3m19s
2025-04-15 20:38:08 -07:00
9d232b666b server: add debug message for WS connection 2025-04-15 20:37:35 -07:00
1832d77e78 chore: Release
Some checks failed
Continuous integration / Check (push) Successful in 39s
Continuous integration / Test Suite (push) Successful in 41s
Continuous integration / Trunk (push) Failing after 35s
Continuous integration / build (push) Successful in 48s
Continuous integration / Rustfmt (push) Successful in 56s
Continuous integration / Disallow unused dependencies (push) Successful in 54s
2025-04-15 20:30:21 -07:00
aca6bce1ff web: connect to the correct ws endpoint in production 2025-04-15 20:30:02 -07:00
7bb2f405da chore: Release
Some checks failed
Continuous integration / Check (push) Successful in 36s
Continuous integration / Test Suite (push) Successful in 41s
Continuous integration / Trunk (push) Failing after 35s
Continuous integration / Rustfmt (push) Successful in 30s
Continuous integration / build (push) Successful in 48s
Continuous integration / Disallow unused dependencies (push) Successful in 3m8s
2025-04-15 19:33:55 -07:00
60e2824167 server: reenable per-account unread counts 2025-04-15 19:33:32 -07:00
cffc228b3a chore: Release
Some checks failed
Continuous integration / Check (push) Successful in 36s
Continuous integration / Test Suite (push) Successful in 41s
Continuous integration / Trunk (push) Failing after 35s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / build (push) Successful in 49s
Continuous integration / Disallow unused dependencies (push) Successful in 3m24s
2025-04-15 19:25:41 -07:00
318c366d82 server: disable per-email counts in tags, it's breaking production 2025-04-15 19:25:22 -07:00
90d7f79ca0 server: slow refresh interval as procmail should be on demand 2025-04-15 19:24:59 -07:00
3f87038776 web: proxy /notifcation 2025-04-15 18:39:36 -07:00
92b880f03b chore: Release
Some checks failed
Continuous integration / Check (push) Successful in 36s
Continuous integration / Test Suite (push) Successful in 41s
Continuous integration / Trunk (push) Failing after 35s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / build (push) Successful in 48s
Continuous integration / Disallow unused dependencies (push) Successful in 54s
2025-04-15 17:46:18 -07:00
94f1e84857 server: add notification handlers for refreshing mail and news 2025-04-15 17:45:47 -07:00
221b4f10df chore: Release
Some checks failed
Continuous integration / Check (push) Successful in 36s
Continuous integration / Test Suite (push) Successful in 41s
Continuous integration / Trunk (push) Failing after 35s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / build (push) Successful in 47s
Continuous integration / Disallow unused dependencies (push) Successful in 54s
2025-04-15 16:36:40 -07:00
225615f4ea server: move config to cmdline args 2025-04-15 16:36:19 -07:00
b8ef753f85 chore: Release
Some checks failed
Continuous integration / Check (push) Successful in 37s
Continuous integration / Test Suite (push) Successful in 41s
Continuous integration / Trunk (push) Failing after 35s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / build (push) Successful in 49s
Continuous integration / Disallow unused dependencies (push) Successful in 55s
2025-04-15 16:09:29 -07:00
33edd22f8f web: add mock wasm-socket for building on non-wasm 2025-04-15 16:09:19 -07:00
75e9232095 chore: Release 2025-04-15 16:09:19 -07:00
6daddf11de Remove unused dependencies 2025-04-15 16:09:19 -07:00
36d9eda303 chore: Release 2025-04-15 16:09:19 -07:00
4eb2d4c689 cargo sqlx prepare 2025-04-15 16:09:19 -07:00
edc7119fbf server: finish port to axum w/ websockets 2025-04-15 16:09:19 -07:00
aa1736a285 web: highlight button for current search, bring back debug unread 2025-04-15 16:09:19 -07:00
6f93aa4f34 server: poll for new messages and update clients via WS 2025-04-15 16:09:19 -07:00
0662e6230e server: instrument catchup 2025-04-15 16:09:19 -07:00
30f3f14040 web: plumb websocket messages through to UI 2025-04-15 16:09:19 -07:00
f2042f284e Add websocket handler on server, connect from client
Additionally add /test handler that triggers server->client WS message
2025-04-15 16:09:19 -07:00
b2c73ffa15 Try using axum instead of rocket. WS doesn't seem to work through trunk 2025-04-15 16:09:19 -07:00
d7217d1b3c WIP subscription support, will require switching webserver 2025-04-15 16:09:19 -07:00
638d55a36c web: prototype websocket client 2025-04-15 16:09:19 -07:00
b11f6b5149 fix(deps): update rust crate sqlx to v0.8.5
All checks were successful
Continuous integration / Check (push) Successful in 36s
Continuous integration / Test Suite (push) Successful in 41s
Continuous integration / Trunk (push) Successful in 37s
Continuous integration / Rustfmt (push) Successful in 30s
Continuous integration / build (push) Successful in 52s
Continuous integration / Disallow unused dependencies (push) Successful in 3m20s
2025-04-15 22:31:38 +00:00
d0b5ecf4f2 chore: Release
All checks were successful
Continuous integration / Check (push) Successful in 36s
Continuous integration / Test Suite (push) Successful in 41s
Continuous integration / Trunk (push) Successful in 37s
Continuous integration / Rustfmt (push) Successful in 30s
Continuous integration / build (push) Successful in 48s
Continuous integration / Disallow unused dependencies (push) Successful in 3m28s
2025-04-14 08:40:18 -07:00
7a67c30a2c web: make search input larger and disable focus outline 2025-04-14 08:40:10 -07:00
5ea4694eb8 fix(deps): update rust crate sqlx to v0.8.4
All checks were successful
Continuous integration / Check (push) Successful in 40s
Continuous integration / Trunk (push) Successful in 37s
Continuous integration / Rustfmt (push) Successful in 30s
Continuous integration / build (push) Successful in 47s
Continuous integration / Test Suite (push) Successful in 2m44s
Continuous integration / Disallow unused dependencies (push) Successful in 56s
2025-04-14 05:16:45 +00:00
e01dabe6ed chore: Release
All checks were successful
Continuous integration / Check (push) Successful in 36s
Continuous integration / Test Suite (push) Successful in 40s
Continuous integration / Trunk (push) Successful in 38s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / build (push) Successful in 48s
Continuous integration / Disallow unused dependencies (push) Successful in 57s
2025-04-13 22:01:29 -07:00
ecaf0dd0fc web: remove unused import 2025-04-13 22:01:17 -07:00
3d4dcc9e6b chore: Release 2025-04-13 20:53:47 -07:00
28a5d9f219 web: add buttons for just unread news and unread mail 2025-04-13 20:53:19 -07:00
81876d37ea web: fix click handling in news post header 2025-04-13 20:53:19 -07:00
4a6b159ddb web: always show bulk-edit checkbox, fix check logic 2025-04-13 20:53:19 -07:00
d84957cc8c web: use current thread, not first !seen in catchup mode 2025-04-13 20:53:19 -07:00
d53db5b49a chore(deps): lock file maintenance
All checks were successful
Continuous integration / Check (push) Successful in 1m5s
Continuous integration / Test Suite (push) Successful in 41s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / Trunk (push) Successful in 1m41s
Continuous integration / build (push) Successful in 48s
Continuous integration / Disallow unused dependencies (push) Successful in 3m16s
2025-04-14 00:46:58 +00:00
0448368011 chore(deps): lock file maintenance
All checks were successful
Continuous integration / Check (push) Successful in 36s
Continuous integration / Test Suite (push) Successful in 41s
Continuous integration / Trunk (push) Successful in 38s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / Disallow unused dependencies (push) Successful in 59s
Continuous integration / build (push) Successful in 4m41s
2025-04-14 00:02:00 +00:00
36754136fd chore: Release
All checks were successful
Continuous integration / Check (push) Successful in 35s
Continuous integration / Test Suite (push) Successful in 40s
Continuous integration / Trunk (push) Successful in 37s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / build (push) Successful in 46s
Continuous integration / Disallow unused dependencies (push) Successful in 57s
2025-04-13 08:31:45 -07:00
489acccf77 web: force background color for code snippets 2025-04-13 08:31:20 -07:00
8ef4db63ad fix(deps): update rust crate clap to v4.5.36
All checks were successful
Continuous integration / Check (push) Successful in 37s
Continuous integration / Trunk (push) Successful in 38s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / Test Suite (push) Successful in 1m56s
Continuous integration / build (push) Successful in 47s
Continuous integration / Disallow unused dependencies (push) Successful in 3m44s
2025-04-11 20:46:39 +00:00
9f63205ff3 chore: Release
All checks were successful
Continuous integration / Check (push) Successful in 37s
Continuous integration / Test Suite (push) Successful in 40s
Continuous integration / Trunk (push) Successful in 37s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / build (push) Successful in 47s
Continuous integration / Disallow unused dependencies (push) Successful in 57s
2025-04-10 12:35:10 -07:00
5a0378948d web: apply title wrapping on search results page 2025-04-10 12:32:46 -07:00
2b4c45be74 web: conditionally wrap title when large words found 2025-04-10 12:16:53 -07:00
147896dc80 chore: Release
All checks were successful
Continuous integration / Check (push) Successful in 1m20s
Continuous integration / Test Suite (push) Successful in 39s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / build (push) Successful in 47s
Continuous integration / Disallow unused dependencies (push) Successful in 57s
Continuous integration / Trunk (push) Successful in 11m34s
2025-04-09 20:35:49 -07:00
1ff6ec7653 web: wrap long titles on message view 2025-04-09 20:35:33 -07:00
acd590111e chore: Release
All checks were successful
Continuous integration / Check (push) Successful in 36s
Continuous integration / Test Suite (push) Successful in 41s
Continuous integration / Trunk (push) Successful in 45s
Continuous integration / Rustfmt (push) Successful in 54s
Continuous integration / build (push) Successful in 1m35s
Continuous integration / Disallow unused dependencies (push) Successful in 3m30s
2025-04-09 19:17:52 -07:00
b5f24ba1f2 server: strip element sizing attributes and inline style 2025-04-09 19:17:19 -07:00
79ed24135f fix(deps): update rust crate tantivy to 0.24.0
All checks were successful
Continuous integration / Test Suite (push) Successful in 40s
Continuous integration / Check (push) Successful in 1m19s
Continuous integration / Trunk (push) Successful in 37s
Continuous integration / Rustfmt (push) Successful in 37s
Continuous integration / build (push) Successful in 48s
Continuous integration / Disallow unused dependencies (push) Successful in 2m3s
2025-04-09 18:01:42 +00:00
a4949a25b5 fix(deps): update rust crate cacher to 0.2.0
All checks were successful
Continuous integration / Check (push) Successful in 35s
Continuous integration / Test Suite (push) Successful in 40s
Continuous integration / Trunk (push) Successful in 37s
Continuous integration / Rustfmt (push) Successful in 44s
Continuous integration / build (push) Successful in 47s
Continuous integration / Disallow unused dependencies (push) Successful in 2m21s
2025-04-07 03:46:21 +00:00
f16edef124 chore(deps): lock file maintenance
All checks were successful
Continuous integration / Check (push) Successful in 35s
Continuous integration / Trunk (push) Successful in 37s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / build (push) Successful in 1m6s
Continuous integration / Test Suite (push) Successful in 3m2s
Continuous integration / Disallow unused dependencies (push) Successful in 56s
2025-04-07 00:01:51 +00:00
2fd6479cb9 fix(deps): update rust crate tokio to v1.44.2
All checks were successful
Continuous integration / Test Suite (push) Successful in 1m15s
Continuous integration / Trunk (push) Successful in 37s
Continuous integration / Rustfmt (push) Successful in 30s
Continuous integration / build (push) Successful in 45s
Continuous integration / Check (push) Successful in 4m17s
Continuous integration / Disallow unused dependencies (push) Successful in 57s
2025-04-05 15:47:48 +00:00
85a6b3a9a4 chore: Release
All checks were successful
Continuous integration / Check (push) Successful in 36s
Continuous integration / Test Suite (push) Successful in 40s
Continuous integration / Trunk (push) Successful in 38s
Continuous integration / Rustfmt (push) Successful in 38s
Continuous integration / build (push) Successful in 48s
Continuous integration / Disallow unused dependencies (push) Successful in 2m6s
2025-04-02 16:53:57 -07:00
9ac5216d6e web: more pre/code css tweaks 2025-04-02 16:53:37 -07:00
82987dbd20 web: tweak style of code blocks 2025-04-02 16:46:24 -07:00
29de7c0727 chore: Release
All checks were successful
Continuous integration / Check (push) Successful in 36s
Continuous integration / Test Suite (push) Successful in 40s
Continuous integration / Trunk (push) Successful in 37s
Continuous integration / Rustfmt (push) Successful in 37s
Continuous integration / build (push) Successful in 50s
Continuous integration / Disallow unused dependencies (push) Successful in 2m22s
2025-04-02 13:27:18 -07:00
5f6580fa2f web: remove unreachable code 2025-04-02 13:27:02 -07:00
5d4732d75d chore: Release 2025-04-02 12:22:29 -07:00
a13bac813a web: make money stuff mobile friendly 2025-04-02 12:21:54 -07:00
85dcc9f7bd fix(deps): update rust crate clap to v4.5.35
All checks were successful
Continuous integration / Test Suite (push) Successful in 43s
Continuous integration / Trunk (push) Successful in 38s
Continuous integration / Check (push) Successful in 1m24s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / build (push) Successful in 1m20s
Continuous integration / Disallow unused dependencies (push) Successful in 56s
2025-04-01 17:31:11 +00:00
b696629ad9 chore(deps): lock file maintenance
All checks were successful
Continuous integration / Check (push) Successful in 37s
Continuous integration / Test Suite (push) Successful in 42s
Continuous integration / Trunk (push) Successful in 38s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / Disallow unused dependencies (push) Successful in 57s
Continuous integration / build (push) Successful in 1m29s
2025-03-30 23:46:58 +00:00
b9e3128718 fix(deps): update all non-major dependencies
All checks were successful
Continuous integration / Check (push) Successful in 1m17s
Continuous integration / Test Suite (push) Successful in 39s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / Trunk (push) Successful in 1m4s
Continuous integration / build (push) Successful in 49s
Continuous integration / Disallow unused dependencies (push) Successful in 2m33s
2025-03-30 23:17:15 +00:00
88fac4c2bc chore: Release
All checks were successful
Continuous integration / Check (push) Successful in 38s
Continuous integration / Test Suite (push) Successful in 39s
Continuous integration / Trunk (push) Successful in 37s
Continuous integration / Rustfmt (push) Successful in 45s
Continuous integration / build (push) Successful in 50s
Continuous integration / Disallow unused dependencies (push) Successful in 2m28s
2025-03-30 16:10:01 -07:00
1fad5ec536 server: remove unused dep opentelemetry 2025-03-30 16:09:42 -07:00
8e7214d531 chore: Release
All checks were successful
Continuous integration / Test Suite (push) Successful in 40s
Continuous integration / Check (push) Successful in 1m3s
Continuous integration / Trunk (push) Successful in 39s
Continuous integration / Rustfmt (push) Successful in 38s
Continuous integration / build (push) Successful in 49s
Continuous integration / Disallow unused dependencies (push) Successful in 2m4s
2025-03-30 11:18:44 -07:00
333c4a3ebb server: rewrite old nzbfinder download links 2025-03-30 11:18:19 -07:00
b9ba5a3bea fix(deps): update all non-major dependencies
All checks were successful
Continuous integration / Check (push) Successful in 55s
Continuous integration / Test Suite (push) Successful in 39s
Continuous integration / Trunk (push) Successful in 38s
Continuous integration / Rustfmt (push) Successful in 55s
Continuous integration / build (push) Successful in 1m19s
Continuous integration / Disallow unused dependencies (push) Successful in 2m46s
2025-03-20 05:31:31 +00:00
2a0989e74d chore(deps): lock file maintenance
All checks were successful
Continuous integration / Check (push) Successful in 38s
Continuous integration / Trunk (push) Successful in 53s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / build (push) Successful in 51s
Continuous integration / Disallow unused dependencies (push) Successful in 56s
Continuous integration / Test Suite (push) Successful in 4m22s
2025-03-17 00:01:34 +00:00
e9319dc491 fix(deps): update rust crate async-trait to v0.1.88
All checks were successful
Continuous integration / Test Suite (push) Successful in 48s
Continuous integration / Trunk (push) Successful in 38s
Continuous integration / Rustfmt (push) Successful in 34s
Continuous integration / build (push) Successful in 47s
Continuous integration / Check (push) Successful in 3m43s
Continuous integration / Disallow unused dependencies (push) Successful in 59s
2025-03-15 01:16:46 +00:00
57481a77cd fix(deps): update rust crate uuid to v1.16.0
All checks were successful
Continuous integration / Check (push) Successful in 36s
Continuous integration / Test Suite (push) Successful in 39s
Continuous integration / Trunk (push) Successful in 37s
Continuous integration / build (push) Successful in 48s
Continuous integration / Rustfmt (push) Successful in 1m10s
Continuous integration / Disallow unused dependencies (push) Successful in 57s
2025-03-14 04:31:07 +00:00
44915cce54 fix(deps): update rust crate tokio to v1.44.1
All checks were successful
Continuous integration / Test Suite (push) Successful in 40s
Continuous integration / Trunk (push) Successful in 39s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / Check (push) Successful in 2m26s
Continuous integration / build (push) Successful in 47s
Continuous integration / Disallow unused dependencies (push) Successful in 3m26s
2025-03-13 08:31:33 +00:00
1225483b57 chore: Release
All checks were successful
Continuous integration / Check (push) Successful in 35s
Continuous integration / Test Suite (push) Successful in 40s
Continuous integration / Trunk (push) Successful in 38s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / Disallow unused dependencies (push) Successful in 56s
Continuous integration / build (push) Successful in 5m37s
2025-03-12 16:44:04 -07:00
daeb8c88a1 server: recover on slurp fetch failures 2025-03-12 16:43:48 -07:00
8a6b3ff501 chore: Release
All checks were successful
Continuous integration / Check (push) Successful in 36s
Continuous integration / Test Suite (push) Successful in 39s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / build (push) Successful in 46s
Continuous integration / Trunk (push) Successful in 12m56s
Continuous integration / Disallow unused dependencies (push) Successful in 4m0s
2025-03-12 13:53:27 -07:00
a6fffeafdc web: change autoreload logic 2025-03-12 13:53:11 -07:00
d791b4ce49 chore: Release 2025-03-12 13:50:45 -07:00
8a0e4eb441 web: log all state changes and don't autoreload on error, causes infinite loop 2025-03-12 13:50:39 -07:00
fc84562419 fix(deps): update rust crate reqwest to v0.12.14
All checks were successful
Continuous integration / Test Suite (push) Successful in 40s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / build (push) Successful in 46s
Continuous integration / Disallow unused dependencies (push) Successful in 56s
Continuous integration / Check (push) Successful in 5m27s
Continuous integration / Trunk (push) Successful in 8m3s
2025-03-12 13:46:26 +00:00
37ebe1ebb3 fix(deps): update rust crate reqwest to v0.12.13
All checks were successful
Continuous integration / Test Suite (push) Successful in 1m14s
Continuous integration / Trunk (push) Successful in 38s
Continuous integration / Check (push) Successful in 2m30s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / Disallow unused dependencies (push) Successful in 56s
Continuous integration / build (push) Successful in 2m17s
2025-03-11 20:47:18 +00:00
2d06f070ea chore: Release
All checks were successful
Continuous integration / Check (push) Successful in 42s
Continuous integration / Test Suite (push) Successful in 51s
Continuous integration / Trunk (push) Successful in 40s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / Disallow unused dependencies (push) Successful in 56s
Continuous integration / build (push) Successful in 15m50s
2025-03-10 19:38:57 -07:00
527a62069a Revert "web: center contents in catchup mode"
This reverts commit 1411961e36.
2025-03-10 19:38:32 -07:00
40afafe1a8 fix(deps): update rust crate clap to v4.5.32
All checks were successful
Continuous integration / Test Suite (push) Successful in 55s
Continuous integration / Trunk (push) Successful in 44s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / build (push) Successful in 1m2s
Continuous integration / Disallow unused dependencies (push) Successful in 56s
Continuous integration / Check (push) Successful in 6m34s
2025-03-10 21:01:24 +00:00
e3acf9ae6d chore(deps): lock file maintenance
All checks were successful
Continuous integration / Check (push) Successful in 43s
Continuous integration / Test Suite (push) Successful in 52s
Continuous integration / Trunk (push) Successful in 54s
Continuous integration / build (push) Successful in 1m0s
Continuous integration / Rustfmt (push) Successful in 1m21s
Continuous integration / Disallow unused dependencies (push) Successful in 57s
2025-03-10 00:05:51 +00:00
a68d067a68 fix(deps): update rust crate serde to v1.0.219
All checks were successful
Continuous integration / Check (push) Successful in 42s
Continuous integration / Test Suite (push) Successful in 49s
Continuous integration / Trunk (push) Successful in 44s
Continuous integration / Rustfmt (push) Successful in 55s
Continuous integration / build (push) Successful in 55s
Continuous integration / Disallow unused dependencies (push) Successful in 2m48s
2025-03-09 20:01:48 +00:00
5547c65af0 fix(deps): update rust crate tokio to v1.44.0
All checks were successful
Continuous integration / Check (push) Successful in 40s
Continuous integration / Test Suite (push) Successful in 1m1s
Continuous integration / Trunk (push) Successful in 1m12s
Continuous integration / Rustfmt (push) Successful in 52s
Continuous integration / Disallow unused dependencies (push) Successful in 1m21s
Continuous integration / build (push) Successful in 19m6s
2025-03-09 16:24:42 +00:00
b622bb7d7d chore: Release
All checks were successful
Continuous integration / Test Suite (push) Successful in 47s
Continuous integration / Check (push) Successful in 6m14s
Continuous integration / Trunk (push) Successful in 41s
Continuous integration / build (push) Successful in 54s
Continuous integration / Rustfmt (push) Successful in 1m43s
Continuous integration / Disallow unused dependencies (push) Successful in 58s
2025-03-08 07:57:33 -08:00
43efdf18a0 web: reload page on fetch error. Should help with expired cookies 2025-03-08 07:57:12 -08:00
c71ab8e9e8 chore: Release 2025-03-08 07:52:40 -08:00
408d6ed8ba web: only reload on version skew in release 2025-03-08 07:52:03 -08:00
1411961e36 web: center contents in catchup mode 2025-03-08 07:52:03 -08:00
dfd7ef466c Only rebuild on push 2025-03-08 07:52:03 -08:00
2aa3dfbd0f fix(deps): update rust crate serde_json to v1.0.140
All checks were successful
Continuous integration / Test Suite (pull_request) Successful in 46s
Continuous integration / Check (pull_request) Successful in 2m2s
Continuous integration / Trunk (pull_request) Successful in 38s
Continuous integration / build (pull_request) Successful in 52s
Continuous integration / Rustfmt (pull_request) Successful in 1m16s
Continuous integration / Disallow unused dependencies (pull_request) Successful in 55s
Continuous integration / Check (push) Successful in 38s
Continuous integration / Trunk (push) Successful in 39s
Continuous integration / Rustfmt (push) Successful in 30s
Continuous integration / build (push) Successful in 52s
Continuous integration / Disallow unused dependencies (push) Successful in 56s
Continuous integration / Test Suite (push) Successful in 4m26s
2025-03-03 09:46:00 +00:00
fba10e27cf fix(deps): update all non-major dependencies
All checks were successful
Continuous integration / Check (pull_request) Successful in 38s
Continuous integration / Test Suite (pull_request) Successful in 46s
Continuous integration / Trunk (pull_request) Successful in 40s
Continuous integration / Rustfmt (pull_request) Successful in 31s
Continuous integration / Disallow unused dependencies (pull_request) Successful in 55s
Continuous integration / build (pull_request) Successful in 3m24s
Continuous integration / Test Suite (push) Successful in 45s
Continuous integration / Trunk (push) Successful in 39s
Continuous integration / Rustfmt (push) Successful in 32s
Continuous integration / Check (push) Successful in 2m43s
Continuous integration / build (push) Successful in 57s
Continuous integration / Disallow unused dependencies (push) Successful in 2m45s
2025-03-03 06:03:25 +00:00
5417c74f9c fix(deps): update rust crate thiserror to v2.0.12
All checks were successful
Continuous integration / Check (pull_request) Successful in 43s
Continuous integration / Test Suite (pull_request) Successful in 46s
Continuous integration / Trunk (pull_request) Successful in 39s
Continuous integration / Rustfmt (pull_request) Successful in 32s
Continuous integration / Disallow unused dependencies (pull_request) Successful in 57s
Continuous integration / build (pull_request) Successful in 2m36s
Continuous integration / Check (push) Successful in 42s
Continuous integration / Test Suite (push) Successful in 45s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / build (push) Successful in 57s
Continuous integration / Trunk (push) Successful in 2m22s
Continuous integration / Disallow unused dependencies (push) Successful in 57s
2025-03-03 04:46:31 +00:00
eb0b0dbe81 chore(deps): lock file maintenance
All checks were successful
Continuous integration / Test Suite (pull_request) Successful in 45s
Continuous integration / Trunk (pull_request) Successful in 38s
Continuous integration / Rustfmt (pull_request) Successful in 30s
Continuous integration / Check (pull_request) Successful in 3m6s
Continuous integration / build (pull_request) Successful in 56s
Continuous integration / Disallow unused dependencies (pull_request) Successful in 2m21s
Continuous integration / Test Suite (push) Successful in 49s
Continuous integration / Trunk (push) Successful in 39s
Continuous integration / Check (push) Successful in 2m37s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / Disallow unused dependencies (push) Successful in 56s
Continuous integration / build (push) Successful in 4m9s
2025-03-03 00:01:36 +00:00
561f522658 fix(deps): update rust crate mailparse to v0.16.1
All checks were successful
Continuous integration / Check (pull_request) Successful in 38s
Continuous integration / Test Suite (pull_request) Successful in 44s
Continuous integration / Trunk (pull_request) Successful in 39s
Continuous integration / Rustfmt (pull_request) Successful in 32s
Continuous integration / Disallow unused dependencies (pull_request) Successful in 56s
Continuous integration / build (pull_request) Successful in 2m50s
Continuous integration / Test Suite (push) Successful in 44s
Continuous integration / Trunk (push) Successful in 39s
Continuous integration / Rustfmt (push) Successful in 31s
Continuous integration / Check (push) Successful in 2m4s
Continuous integration / build (push) Successful in 51s
Continuous integration / Disallow unused dependencies (push) Successful in 2m40s
2025-02-27 23:33:39 +00:00
32d2ffeb3d chore: Release
2025-02-27 15:16:09 -08:00
d41946e0a5 web: change style for mark read catchup button 2025-02-27 15:15:49 -08:00
61402858f4 web: add TODO 2025-02-27 15:15:42 -08:00
17de318645 chore: Release
2025-02-26 15:43:34 -08:00
3aa0144e8d web: try setting history.scroll_restoration to manual to improve inter-page flow 2025-02-26 15:43:18 -08:00
f9eafff4c7 web: add "go home" button to catchup view 2025-02-26 15:43:18 -08:00
4c6d67901d fix(deps): update rust crate uuid to v1.15.1
2025-02-26 21:15:57 +00:00
e9aa97a089 fix(deps): update rust crate chrono to v0.4.40
2025-02-26 08:46:20 +00:00
a82b047f75 fix(deps): update rust crate uuid to v1.15.0
2025-02-26 06:16:01 +00:00
9a8b44a8df fix(deps): update all non-major dependencies to 0.0.40
2025-02-26 04:47:10 +00:00
a96693004c chore: Release
2025-02-25 20:43:47 -08:00
ed9fe11fbf web: trimmed views for catchup mode 2025-02-25 20:43:27 -08:00
09fb14a796 chore: Release
2025-02-25 20:08:44 -08:00
58a7936bba web: address lint 2025-02-25 20:08:31 -08:00
cd0ee361f5 chore: Release 2025-02-25 20:06:18 -08:00
77bd5abe0d Don't do incremental builds when releasing 2025-02-25 20:06:11 -08:00
450c5496b3 chore: Release 2025-02-25 20:04:01 -08:00
4411e45a3c Don't allow warnings when publishing 2025-02-25 20:03:40 -08:00
e7d20896d5 web: remove unnecessary Msg variant 2025-02-25 16:20:32 -08:00
32a1115abd chore: Release
2025-02-25 15:58:46 -08:00
4982057500 web: more scroll to top improvements by reworking URL changes 2025-02-25 15:58:24 -08:00
8977f8bab5 chore: Release
2025-02-25 13:51:38 -08:00
0962a6b3cf web: improve scroll-to-top behavior 2025-02-25 13:51:11 -08:00
3c72929a4f web: enable properly styled buttons 2025-02-25 10:26:16 -08:00
e4eb495a70 web: properly exit catchup mode when done 2025-02-25 10:25:28 -08:00
00e8b0342e chore: Release
2025-02-24 18:41:19 -08:00
b1f9867c06 web: remove debug statement 2025-02-24 18:41:00 -08:00
77943b3570 web: scroll to top on page changes 2025-02-24 18:39:47 -08:00
45e4edb1dd web: add icons to catchup controls 2025-02-24 17:09:16 -08:00
9bf53afebf server: sort catchup ids by timestamp across all sources 2025-02-24 17:08:57 -08:00
e1a502ac4b chore: Release
2025-02-24 14:56:17 -08:00
9346c46e62 web: change exit catchup behavior to view current message 2025-02-24 14:55:51 -08:00
1452746305 chore: Release
2025-02-24 14:38:44 -08:00
2e526dace1 Implement catchup mode
Show original/delivered To if no xinu.tv addresses in To/CC fields
2025-02-24 14:38:18 -08:00
76be5b7cac fix(deps): update rust crate clap to v4.5.31
2025-02-24 16:00:55 +00:00
3f0b2caedf fix(deps): update rust crate scraper to 0.23.0
2025-02-24 09:31:24 +00:00
ec6dc35ca8 chore(deps): lock file maintenance
2025-02-24 00:01:18 +00:00
01e1ca927e chore: Release
2025-02-23 11:47:04 -08:00
1cc52d6c96 web: show X-Original-To: if To: is missing, fall back to Delivered-To: 2025-02-23 11:46:21 -08:00
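The header fallback order in 1cc52d6c96 (To:, then X-Original-To:, then Delivered-To:) can be sketched as a chain over optional headers. This is a minimal illustration only; the struct and function names are hypothetical, not the actual letterbox API.

```rust
// Pick the recipient to display: prefer To:, then X-Original-To:,
// then Delivered-To:; None if the message carries none of them.
// Hypothetical sketch; field and function names are illustrative.
struct Headers {
    to: Option<String>,
    x_original_to: Option<String>,
    delivered_to: Option<String>,
}

fn display_recipient(h: &Headers) -> Option<&str> {
    h.to
        .as_deref()
        .or(h.x_original_to.as_deref())
        .or(h.delivered_to.as_deref())
}

fn main() {
    // To: is absent, so the X-Original-To: value wins.
    let h = Headers {
        to: None,
        x_original_to: Some("user@xinu.tv".into()),
        delivered_to: Some("mbox@xinu.tv".into()),
    };
    assert_eq!(display_recipient(&h), Some("user@xinu.tv"));
}
```

`Option::or` short-circuits left to right, which makes the precedence explicit in one expression.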
e6b3a5b5a9 notmuch & server: plumb Delivered-To and X-Original-To headers 2025-02-23 09:37:09 -08:00
bc4b15a5aa chore: Release
2025-02-22 17:58:37 -08:00
00f61cf6be server: recursively descend email threads to find all unread recipients 2025-02-22 17:58:07 -08:00
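The recursive descent in 00f61cf6be can be sketched over a hypothetical thread tree, with the lower-casing from 393ffc8506 applied while collecting. The `Msg` type and its fields are illustrative assumptions, not letterbox's real thread representation.

```rust
use std::collections::BTreeSet;

// Hypothetical message node; letterbox's actual thread type differs.
struct Msg {
    unread_recipients: Vec<String>,
    replies: Vec<Msg>,
}

// Walk the whole reply tree, not just the top-level messages,
// normalizing each address to lower case and deduping via a set.
fn collect_unread(msg: &Msg, out: &mut BTreeSet<String>) {
    for addr in &msg.unread_recipients {
        out.insert(addr.to_lowercase());
    }
    for reply in &msg.replies {
        collect_unread(reply, out);
    }
}

fn main() {
    let thread = Msg {
        unread_recipients: vec!["A@xinu.tv".into()],
        replies: vec![Msg {
            unread_recipients: vec!["b@xinu.tv".into(), "a@xinu.tv".into()],
            replies: vec![],
        }],
    };
    let mut seen = BTreeSet::new();
    collect_unread(&thread, &mut seen);
    // "A@xinu.tv" and "a@xinu.tv" collapse after lowercasing.
    assert_eq!(seen.len(), 2);
}
```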
52e24437bd chore: Release
2025-02-22 17:27:54 -08:00
393ffc8506 notmuch: normalize unread_recipients to lower case 2025-02-22 17:27:30 -08:00
2b6cb6ec6e chore: Release
2025-02-22 17:24:31 -08:00
0cba3a624c web: add de/select all checkbox with tristate 2025-02-22 17:24:18 -08:00
73433711ca fix(deps): update rust crate xtracing to 0.3.0
2025-02-23 00:02:30 +00:00
965afa6871 Merge pull request 'fix(deps): update rust crate seed_hooks to 0.4.0' (#48) from renovate/seed_hooks-0.x into master
Reviewed-on: #48
2025-02-22 15:49:50 -08:00
e70dbaf917 fix(deps): update rust crate seed_hooks to 0.4.0
2025-02-22 15:18:33 -08:00
6b4ce11743 fix(deps): update rust crate xtracing to v0.2.1
2025-02-22 22:31:55 +00:00
d1980a55a7 fix(deps): update rust crate cacher to v0.1.5
2025-02-22 21:16:46 +00:00
8b78b39d4c chore: Release
2025-02-22 13:10:03 -08:00
ae17651eb5 Normalize Justfile config 2025-02-22 13:08:15 -08:00
22fd8409f6 chore: Release
All checks were successful
Continuous integration / Check (push) Successful in 40s
Continuous integration / Test Suite (push) Successful in 42s
Continuous integration / Trunk (push) Successful in 38s
Continuous integration / Rustfmt (push) Successful in 52s
Continuous integration / build (push) Successful in 50s
Continuous integration / Disallow unused dependencies (push) Successful in 2m21s
2025-02-22 12:41:57 -08:00
d0a4ba417f chore: Release 2025-02-22 12:41:30 -08:00
7b09b098a4 chore: Release 2025-02-22 12:41:15 -08:00
bd4c10a8fb Specify registry for all letterbox-* deps 2025-02-22 12:41:15 -08:00
ed3c5f152e chore: Release 2025-02-22 12:41:15 -08:00
63232d1e92 Publish only to xinu 2025-02-22 12:41:15 -08:00
4a3eba80d5 chore: Release 2025-02-22 12:41:15 -08:00
71d3745342 Try relative paths for letterbox-* deps 2025-02-22 12:41:14 -08:00
5fdc98633d chore: Release 2025-02-22 12:39:39 -08:00
57877f268d Set repository in workspace 2025-02-22 12:39:20 -08:00
871a93d58f Move most package metadata to workspace 2025-02-22 12:39:20 -08:00
4b7cbd4f9b chore: Release 2025-02-22 12:39:19 -08:00
aa2a9815df Add automatic per-email address unread folders 2025-02-22 12:38:57 -08:00
2e5b18a008 Fix cargo-udeps build step 2025-02-22 12:37:27 -08:00
d0a38114cc Add cargo-udeps build step 2025-02-22 12:37:27 -08:00
ccc1d516c7 fix(deps): update rust crate letterbox-notmuch to 0.8.0
2025-02-22 19:15:52 +00:00
246b710fdd fix(deps): update rust crate log to v0.4.26
2025-02-21 05:46:06 +00:00
1a21c9fa8e fix(deps): update rust crate uuid to v1.14.0
2025-02-21 00:30:51 +00:00
9fd912b1d4 fix(deps): update rust crate serde to v1.0.218
2025-02-20 05:31:10 +00:00
9ded32f97b fix(deps): update rust crate anyhow to v1.0.96
2025-02-20 03:16:55 +00:00
10aac046bc fix(deps): update rust crate serde_json to v1.0.139
2025-02-20 03:00:53 +00:00
f4527baf89 fix(deps): update rust crate seed_hooks to 0.4.0
2025-02-18 20:15:48 +00:00
11ec5bf747 fix(deps): update rust crate uuid to v1.13.2
2025-02-17 23:46:05 +00:00
6a53679755 fix(deps): update rust crate clap to v4.5.30
2025-02-17 19:15:50 +00:00
7bedec0692 chore(deps): lock file maintenance
2025-02-17 00:01:14 +00:00
78feb95811 chore: Release
2025-02-15 14:49:11 -08:00
3aad2bb80e web: another attempt to fix progress bar 2025-02-15 14:47:32 -08:00
0df8de3661 web: use seed_hooks ability to create ev handlers 2025-02-15 14:47:32 -08:00
83ecc73fbd fix(deps): update rust crate seed_hooks to v0.1.16
2025-02-14 01:15:49 +00:00
c10313cd12 fix(deps): update rust crate letterbox-shared to 0.6.0
2025-02-13 23:31:34 +00:00
4c98bcd9cb Merge pull request 'fix(deps): update rust crate letterbox-notmuch to 0.6.0' (#34) from renovate/letterbox-notmuch-0.x into master
Reviewed-on: #34
2025-02-13 15:17:39 -08:00
004de235a8 fix(deps): update rust crate letterbox-notmuch to 0.6.0
2025-02-13 23:16:31 +00:00
90dbeb6f20 chore: Release
2025-02-13 15:09:58 -08:00
9aa298febe web: use crate version of seed_hooks 2025-02-13 15:09:34 -08:00
5a13a497dc chore: Release 2025-02-13 14:30:47 -08:00
37711e14dd chore: Release 2025-02-13 14:01:24 -08:00
e89fd28707 web: pin seed_hooks version 2025-02-13 14:01:06 -08:00
7a91ee2f49 chore: Release 2025-02-13 13:29:52 -08:00
4b76ea5392 Justfile: run release w/ --no-confirm 2025-02-13 13:29:29 -08:00
d2a81b7bd9 Revert "Justfile: try without --workspace flag"
This reverts commit 9dd39509b5.
2025-02-13 13:29:17 -08:00
9dd39509b5 Justfile: try without --workspace flag 2025-02-13 13:28:35 -08:00
d605bcfe7a web: move to version 0.3 to sync with other crates 2025-02-13 13:25:01 -08:00
73abdb535a Justfile: actually call _release on build 2025-02-13 11:56:09 -08:00
ab9506c4f6 Starter justfile that will hopefully replace make 2025-02-13 11:51:59 -08:00
994a629401 web: update letterbox-notmuch dependency
2025-02-13 11:37:32 -08:00
00c55160a7 Add web back to workspace 2025-02-13 11:31:43 -08:00
e3c6edb894 Merge pull request 'fix(deps): update rust crate letterbox-shared to 0.3.0' (#35) from renovate/letterbox-shared-0.x into master
Reviewed-on: #35
2025-02-13 11:31:21 -08:00
4574c016cd fix(deps): update rust crate letterbox-shared to 0.3.0
2025-02-13 18:45:52 +00:00
ca6c19f4c8 chore: Release
2025-02-13 10:32:43 -08:00
0f51f6e71f server: copy vars.css from web so I can publish release 2025-02-13 10:32:20 -08:00
4bd672bf94 chore: Release 2025-02-13 10:18:40 -08:00
136fd77f3b Add server back to workspace 2025-02-13 10:18:30 -08:00
ee9b6be95e Temporarily remove web and server from workspace to publish other crates
2025-02-13 10:16:55 -08:00
38c553d385 Use packaged version of crates 2025-02-13 10:16:36 -08:00
1b073665a7 chore: Release 2025-02-13 09:49:11 -08:00
2076596f50 Rename all crates to start with letterbox- 2025-02-13 09:48:24 -08:00
d1beaded09 Update Cargo.toml for packaging 2025-02-13 09:47:41 -08:00
2562bdfedf server: tool for testing inline code 2025-02-13 09:47:41 -08:00
86c6face7d server: sql to debug search indexing w/ postgres 2025-02-13 09:47:41 -08:00
4a7ff8bf7b notmuch: exclude testdata dir when packaging
Contains filenames cargo package doesn't like
2025-02-13 09:47:41 -08:00
8c280d3616 web: fix styling for slashdot's story byline 2025-02-13 09:47:41 -08:00
eb4d4164ef web: fix progress bar on mobile 2025-02-13 09:47:41 -08:00
c7740811bf fix(deps): update rust crate opentelemetry to 0.28.0
2025-02-12 21:30:57 +00:00
55679cf61b fix(deps): update rust crate xtracing to 0.2.0
2025-02-12 21:15:55 +00:00
1b1c80b1b8 web: annotate some more (temporary) dead code
2025-02-12 13:03:45 -08:00
8743b1f56b web: install trunk in CI
2025-02-12 11:46:31 -08:00
eb6f1b5346 web: run trunk build in CI
2025-02-12 09:03:37 -08:00
6bb6d380a9 Bumping version to 0.0.144
2025-02-12 08:50:09 -08:00
39eea04bf6 Bumping version to 0.0.143 2025-02-12 08:50:04 -08:00
2711147cd6 web: hide nautilus ads 2025-02-12 08:50:04 -08:00
083b7c9f1c Merge pull request 'fix(deps): update rust crate thiserror to v2' (#27) from renovate/thiserror-2.x into master
Reviewed-on: #27
2025-02-11 20:27:41 -08:00
5ade886a72 fix(deps): update rust-wasm-bindgen monorepo
2025-02-12 00:46:04 +00:00
52575e13f6 Bumping version to 0.0.142
2025-02-11 16:42:24 -08:00
3aaee8add3 web: rollback wasm-bindgen 2025-02-11 16:42:10 -08:00
5e188a70f9 fix(deps): update rust crate clap to v4.5.29
2025-02-11 20:00:45 +00:00
f9e5c87d2b fix(deps): update rust-wasm-bindgen monorepo
2025-02-11 16:46:05 +00:00
7d40cf8a4a Bumping version to 0.0.141
2025-02-11 08:36:30 -08:00
1836026736 update cacher dependency 2025-02-11 08:36:24 -08:00
79db0f8cfa Bumping version to 0.0.140
2025-02-10 17:44:22 -08:00
95c29dc73c web: CSS indent lists 2025-02-10 17:44:07 -08:00
2b0ee42cdc Bumping version to 0.0.139
2025-02-10 17:33:46 -08:00
c90ac1d4fc web: pin web-sys to 0.2.95, to work with CLI in nixos 2025-02-10 17:33:17 -08:00
a9803bb6a1 fix(deps): update rust crate thiserror to v2
2025-02-11 01:31:42 +00:00
74219ad333 web: fix uuid dep
2025-02-10 17:28:04 -08:00
2073b7b132 Changes necessary for latest cargo packages 2025-02-10 14:57:40 -08:00
58dae5df6f gitea: initial setup
2025-02-09 18:34:07 -08:00
c89fc9b6d4 Merge pull request 'fix(deps): update rust crate mailparse to 0.16.0' (#28) from renovate/mailparse-0.x into master
Reviewed-on: #28
2025-02-09 15:33:57 -08:00
f7ab08c1e6 fix(deps): update rust crate mailparse to 0.16.0 2025-02-09 23:30:40 +00:00
221fead7dc cargo update 2025-02-09 14:54:13 -08:00
3491cb9593 Merge pull request 'fix(deps): update rust crate tokio to v1.43.0' (#24) from renovate/tokio-1.x-lockfile into master
Reviewed-on: #24
2025-02-09 14:52:02 -08:00
037b3231ac fix(deps): update rust crate tokio to v1.43.0 2025-02-09 22:45:44 +00:00
75f38c1e94 Merge pull request 'fix(deps): update rust crate scraper to 0.22.0' (#23) from renovate/scraper-0.x into master
Reviewed-on: #23
2025-02-09 14:30:43 -08:00
977bcd0bf4 Merge pull request 'fix(deps): update rust crate itertools to 0.14.0' (#22) from renovate/itertools-0.x into master
Reviewed-on: #22
2025-02-09 14:30:33 -08:00
838459e5a8 Merge pull request 'fix(deps): update rust crate graphql_client to 0.14.0' (#21) from renovate/graphql_client-0.x into master
Reviewed-on: #21
2025-02-09 14:30:21 -08:00
d208a31348 Merge pull request 'fix(deps): update rust crate gloo-net to 0.6.0' (#20) from renovate/gloo-net-0.x into master
Reviewed-on: #20
2025-02-09 14:30:12 -08:00
0a640bea6f Merge pull request 'fix(deps): update rust crate css-inline to 0.14.0' (#19) from renovate/css-inline-0.x into master
Reviewed-on: #19
2025-02-09 14:30:02 -08:00
84a2962561 Merge pull request 'chore(deps): update dependency font-awesome to v6.7.2' (#18) from renovate/font-awesome-6.x into master
Reviewed-on: #18
2025-02-09 14:29:49 -08:00
6c71be7a3a Merge pull request 'fix(deps): update rust crate xtracing to v0.1.3' (#16) from renovate/xtracing-0.x-lockfile into master
Reviewed-on: #16
2025-02-09 14:29:36 -08:00
77562505b4 Merge pull request 'fix(deps): update rust crate sqlx to v0.8.3' (#15) from renovate/sqlx-0.x-lockfile into master
Reviewed-on: #15
2025-02-09 14:29:24 -08:00
c83d3dcf1d Merge pull request 'fix(deps): update rust crate serde_json to v1.0.138' (#14) from renovate/serde_json-1.x-lockfile into master
Reviewed-on: #14
2025-02-09 14:29:03 -08:00
081077d2c2 Merge pull request 'fix(deps): update rust crate serde to v1.0.217' (#13) from renovate/serde-monorepo into master
Reviewed-on: #13
2025-02-09 14:28:53 -08:00
4cfc6a73fc Merge pull request 'fix(deps): update rust crate log to v0.4.25' (#11) from renovate/log-0.x-lockfile into master
Reviewed-on: #11
2025-02-09 14:28:43 -08:00
f1c132830f Merge pull request 'fix(deps): update rust crate clap to v4.5.28' (#10) from renovate/clap-4.x-lockfile into master
Reviewed-on: #10
2025-02-09 14:28:30 -08:00
5aff7c6e85 Merge pull request 'fix(deps): update rust crate cacher to v0.1.4' (#9) from renovate/cacher-0.x-lockfile into master
Reviewed-on: #9
2025-02-09 14:28:19 -08:00
2c09713e20 Merge pull request 'fix(deps): update rust crate async-trait to v0.1.86' (#7) from renovate/async-trait-0.x-lockfile into master
Reviewed-on: #7
2025-02-09 14:28:07 -08:00
3d544feeb5 Merge pull request 'fix(deps): update rust crate ammonia to v4' (#25) from renovate/ammonia-4.x into master
Reviewed-on: #25
2025-02-09 13:57:23 -08:00
5830ed0bb1 Merge branch 'master' into renovate/ammonia-4.x 2025-02-09 13:57:13 -08:00
83aed683f5 fix(deps): update rust crate sqlx to v0.8.3 2025-02-09 21:15:54 +00:00
72385b3987 Merge pull request 'fix(deps): update rust crate lol_html to v2' (#26) from renovate/lol_html-2.x into master
Reviewed-on: #26
2025-02-09 13:01:28 -08:00
f21893b52e Bumping version to 0.0.138 2025-02-09 12:52:36 -08:00
0b81529509 build-info: one last version bump 2025-02-09 12:52:23 -08:00
9790bbea83 Bumping version to 0.0.137 2025-02-09 12:49:53 -08:00
7aa620a9da Update all build-info versions to fix build 2025-02-09 12:49:25 -08:00
2e67db0b4e fix(deps): update rust crate css-inline to 0.14.0 2025-02-09 20:30:48 +00:00
cd777b2894 fix(deps): update rust crate lol_html to v2 2025-02-09 20:17:15 +00:00
049e9728a2 fix(deps): update rust crate ammonia to v4 2025-02-09 20:17:10 +00:00
0952cdf9cb fix(deps): update rust crate scraper to 0.22.0 2025-02-09 20:16:59 +00:00
5f4a4e81cb fix(deps): update rust crate itertools to 0.14.0 2025-02-09 20:16:54 +00:00
38c2c508e8 fix(deps): update rust crate graphql_client to 0.14.0 2025-02-09 20:16:48 +00:00
4cd3664e32 fix(deps): update rust crate gloo-net to 0.6.0 2025-02-09 20:16:44 +00:00
71996f6c48 chore(deps): update dependency font-awesome to v6.7.2 2025-02-09 20:16:33 +00:00
6e227de00f fix(deps): update rust crate xtracing to v0.1.3 2025-02-09 20:16:24 +00:00
3576e67af7 Merge pull request 'fix(deps): update rust crate reqwest to v0.12.12' (#12) from renovate/reqwest-0.x-lockfile into master
Reviewed-on: #12
2025-02-09 12:16:14 -08:00
19f0f60653 fix(deps): update rust crate serde_json to v1.0.138 2025-02-09 20:16:12 +00:00
3502eeb711 fix(deps): update rust crate serde to v1.0.217 2025-02-09 20:16:02 +00:00
fd770d03ab fix(deps): update rust crate reqwest to v0.12.12 2025-02-09 20:15:54 +00:00
d99b7ae34c fix(deps): update rust crate log to v0.4.25 2025-02-09 20:15:48 +00:00
f18aa8c8d4 fix(deps): update rust crate clap to v4.5.28 2025-02-09 20:15:35 +00:00
dcdcb5b5a3 fix(deps): update rust crate cacher to v0.1.4 2025-02-09 20:15:31 +00:00
884e4b5831 fix(deps): update rust crate async-trait to v0.1.86 2025-02-09 20:15:19 +00:00
5981356492 Merge pull request 'fix(deps): update rust crate async-graphql-rocket to v7.0.15' (#6) from renovate/async-graphql-rocket-7.x-lockfile into master
Reviewed-on: #6
2025-02-09 12:10:15 -08:00
386b6915c5 fix(deps): update rust crate async-graphql-rocket to v7.0.15 2025-02-09 20:09:39 +00:00
5a6f04536f Merge pull request 'chore(deps): update rust crate build-info-build to 0.0.39' (#2) from renovate/build-info-build-0.x into master
Reviewed-on: #2
2025-02-09 11:21:21 -08:00
ae1d9e6db7 Merge pull request 'fix(deps): update rust crate anyhow to v1.0.95' (#3) from renovate/anyhow-1.x-lockfile into master
Reviewed-on: #3
2025-02-09 11:21:03 -08:00
24d50c21f5 fix(deps): update rust crate anyhow to v1.0.95 2025-02-09 19:08:21 +00:00
b4d72da639 chore(deps): update rust crate build-info-build to 0.0.39 2025-02-09 19:08:17 +00:00
dacb258289 Merge pull request 'chore: Configure Renovate' (#1) from renovate/configure into master
Reviewed-on: #1
2025-02-09 11:06:35 -08:00
5c674d4603 Add renovate.json 2025-02-09 19:01:46 +00:00
2e9753e91d Bumping version to 0.0.136 2025-02-06 08:17:10 -08:00
971e1049c7 web: allow plaintext emails to wrap 2025-02-06 08:16:53 -08:00
11c76332f3 Bumping version to 0.0.135 2025-02-06 07:46:34 -08:00
52d03ae964 web: tweak figure bg color on hackaday 2025-02-06 07:46:13 -08:00
c4043f6c56 Bumping version to 0.0.134 2025-02-05 09:18:40 -08:00
dfbac38281 web: style blockquotes in emails 2025-02-05 09:18:05 -08:00
f857c38625 Bumping version to 0.0.133 2025-02-02 09:52:05 -08:00
23823cd85e web: provide CSS overrides for email matching news posts 2025-02-02 09:51:27 -08:00
30b5d0ff9f Bumping version to 0.0.132 2025-01-30 20:19:21 -08:00
60a3b1ef88 web: remove accidentally committed line 2025-01-30 20:18:36 -08:00
a46390d110 Bumping version to 0.0.131 2025-01-30 17:45:35 -08:00
5baac0c77a web: fix width overflow on mobile and maybe progress bar 2025-01-30 17:45:14 -08:00
e6181d41ed web: address a bunch of dead code lint 2025-01-30 15:24:11 -08:00
6a228cfd5e Bumping version to 0.0.130 2025-01-30 14:16:30 -08:00
8d81067206 cargo sqlx prepare 2025-01-30 14:16:29 -08:00
b2e47a9bd4 server: round-robin by site when indexing searches 2025-01-30 14:16:12 -08:00
4eaf50cde4 Bumping version to 0.0.129 2025-01-30 13:55:52 -08:00
f20afe5447 update sqlx prepare 2025-01-30 13:55:38 -08:00
53093f4cce Bumping version to 0.0.128 2025-01-30 13:52:55 -08:00
9324a34d31 cargo sqlx prepare 2025-01-30 13:52:54 -08:00
eecc4bc3ef server: strip style & script tags, also handle some retryable errors on slurp 2025-01-30 13:52:22 -08:00
795029cb06 Bumping version to 0.0.127 2025-01-29 17:25:55 -08:00
bc0135106f server: error when get request has a bad response code 2025-01-29 17:25:26 -08:00
bd2803f81c Bumping version to 0.0.126 2025-01-29 17:10:42 -08:00
215addc2c0 cargo sqlx prepare 2025-01-29 17:10:41 -08:00
69f8e24689 server: index newest news posts first 2025-01-29 17:10:26 -08:00
0817a7a51b Bumping version to 0.0.125 2025-01-29 17:04:16 -08:00
200933591a cargo sqlx prepare 2025-01-29 17:04:15 -08:00
8b7c819b17 server: only index 100 search summaries at a time 2025-01-29 17:03:47 -08:00
dce433ab5a Bumping version to 0.0.124 2025-01-29 16:53:59 -08:00
eb4f2d8b5d server: filter out bad urls when indexing search summary 2025-01-29 16:53:38 -08:00
2008457911 Bumping version to 0.0.123 2025-01-29 16:13:50 -08:00
f6b57e63fd cargo sqlx prepare 2025-01-29 16:13:50 -08:00
d681612e8e server: index all search summaries on refresh 2025-01-29 16:13:44 -08:00
80454cbc7e Bumping version to 0.0.122 2025-01-29 15:44:05 -08:00
78cf59333e cargo sqlx prepare 2025-01-29 15:44:04 -08:00
ab47f32b52 server: fetch search summaries in parallel 2025-01-29 15:43:46 -08:00
d9d58afed9 Bumping version to 0.0.121 2025-01-29 15:24:55 -08:00
d01f9a7e08 cargo sqlx prepare 2025-01-29 15:24:54 -08:00
c6aabf88b9 server: sample DB for missing indexes, should prevent duplication from separate threads 2025-01-29 14:42:59 -08:00
29bf6d9b6d Bumping version to 0.0.120 2025-01-29 14:08:55 -08:00
92bf45bd15 cargo sqlx prepare 2025-01-29 14:08:54 -08:00
12c8e0e33b server: use fetched contents of news for search index 2025-01-29 14:08:20 -08:00
c7aa32b922 Bumping version to 0.0.119 2025-01-28 09:34:56 -08:00
94be4ec572 web: add archive buttons, and adjust when text on buttons is shown 2025-01-28 09:34:36 -08:00
66c299bc4c Bumping version to 0.0.118 2025-01-27 15:48:12 -08:00
d5c4176392 cargo sqlx prepare 2025-01-27 15:48:11 -08:00
bd00542c28 server: use clean_summary field instead of summary 2025-01-27 15:47:55 -08:00
19f029cb6b Bumping version to 0.0.117 2025-01-27 14:15:00 -08:00
198db1492a server: add another The Onion slurp config 2025-01-27 14:14:46 -08:00
f6665b6b6e Bumping version to 0.0.116 2025-01-27 14:01:30 -08:00
ee93d725ba web & server: finish initial tailwind rewrite 2025-01-27 14:00:46 -08:00
70fb635eda server: index on nzb_posts created_at, attempt to speed up homepage 2025-01-27 13:18:36 -08:00
b9fbefe05c server: format chrome css 2025-01-27 13:17:22 -08:00
46f823baae server: use local slurp cache separate from production 2025-01-27 13:16:55 -08:00
cc1e998ec5 web: style version chart 2025-01-26 16:01:35 -08:00
fb73d8272e web: update style for rendering emails, including attachments 2025-01-26 15:56:08 -08:00
87321fb669 web: update stylings for removable tag chiclets 2025-01-26 14:02:39 -08:00
44b60d5070 web: style checkboxes, tweak mobile search bar width 2025-01-26 13:42:20 -08:00
89897aa48f web: style search toolbar 2025-01-26 12:24:06 -08:00
b2879211e4 web: much nicer tag list styling with flex box 2025-01-26 10:58:27 -08:00
6b3567fb1b web: style tag list 2025-01-26 09:42:32 -08:00
c27bcac549 web: switch to debug build and enable minimal optimizations to make wasm work 2025-01-26 09:32:06 -08:00
25d31a6ce7 web: only use one view function, desktop/tablet/mobile handled in CSS 2025-01-26 09:31:44 -08:00
ea280dd366 web: stub out all C![] that need porting to tailwind 2025-01-25 16:56:44 -08:00
9842c8c99c server: add option to inline CSS before slurping contents 2025-01-25 16:09:05 -08:00
906ebd73b2 cargo: don't default to xinu repo, that was misguided 2025-01-25 16:05:05 -08:00
de95781ce7 More lint 2025-01-24 09:38:56 -08:00
c58234fa2e Lint 2025-01-24 09:37:49 -08:00
4099bbe732 Bumping version to 0.0.115 2025-01-19 17:22:37 -08:00
c693d4e78a server: strip html from search index of summaries 2025-01-19 17:22:24 -08:00
f90ff72316 server: fix tantivy/newsreader search bug 2025-01-19 17:22:20 -08:00
bed6ae01f2 Bumping version to 0.0.114 2025-01-19 16:50:50 -08:00
087d6b9a60 Use registry version of formerly git dependencies 2025-01-19 16:50:14 -08:00
b04caa9d5d Bumping version to 0.0.113 2025-01-17 15:51:39 -08:00
17b1125ea3 server: Use crate version of cacher 2025-01-17 15:51:28 -08:00
a8ac79d396 Bumping version to 0.0.112 2025-01-16 16:09:28 -08:00
30cbc260dc web: version bump wasm-bindgen-cli 2025-01-16 16:09:06 -08:00
4601b7e6d3 Bumping version to 0.0.111 2025-01-15 12:27:33 -08:00
28b6f565fd update cacher dependency 2025-01-15 12:27:29 -08:00
48b63b19d5 Bumping version to 0.0.110 2025-01-14 20:55:53 -08:00
184afbb4ee update cacher dependency 2025-01-14 20:55:49 -08:00
f6217810ea Bumping version to 0.0.109 2025-01-14 16:22:24 -08:00
46e2de341b update cacher dependency 2025-01-14 16:22:20 -08:00
9c56fde0b6 Bumping version to 0.0.108 2025-01-14 12:05:38 -08:00
2051e5ebf2 cargo sqlx prepare 2025-01-14 12:05:37 -08:00
5a997e61da web & server: add support for email photos 2025-01-14 12:05:03 -08:00
f27f0deb38 Revert "Remove DB tables that don't seem to work"
This reverts commit 70f437b939.
2025-01-13 21:03:56 -08:00
70f437b939 Remove DB tables that don't seem to work 2025-01-13 20:50:19 -08:00
59648a1b25 Bumping version to 0.0.107 2025-01-12 16:35:17 -08:00
76482c6c15 server: make pagination slightly less bad 2025-01-12 16:35:11 -08:00
de23bae8bd server: add request_id to all graphql logging 2025-01-12 11:40:31 -08:00
e07c0616a2 Bumping version to 0.0.106 2025-01-12 09:26:23 -08:00
13a7de4956 web: refactor mark read logic to be two phases 2025-01-12 09:25:44 -08:00
9ce0aacab0 Bumping version to 0.0.105 2025-01-12 08:34:36 -08:00
ae502a7dfe Bumping version to 0.0.104 2025-01-02 15:19:24 -08:00
947c5970d8 update xtracing dependency 2025-01-02 15:19:17 -08:00
686d163cf6 update xtracing dependency 2025-01-02 15:18:49 -08:00
7c720e66f9 Bumping version to 0.0.103 2024-12-28 15:10:17 -08:00
1029fd7aa2 update cacher dependency 2024-12-28 15:10:12 -08:00
61e59ea315 Bumping version to 0.0.102 2024-12-28 15:09:21 -08:00
5047094bd7 update xtracing dependency 2024-12-28 15:09:16 -08:00
28bd9a9d89 Bumping version to 0.0.101 2024-12-28 15:08:52 -08:00
4b327eeccc update xtracing dependency 2024-12-28 15:08:48 -08:00
d13b5477a5 Bumping version to 0.0.100 2024-12-28 15:08:28 -08:00
8cad404098 update xtracing dependency 2024-12-28 15:08:24 -08:00
23de7186d6 Bumping version to 0.0.99 2024-12-28 15:06:37 -08:00
a26559a07e Bumping version to 0.0.98 2024-12-28 15:04:53 -08:00
1bc7ad9b95 update xtracing dependency 2024-12-28 15:04:48 -08:00
1ac844c08d Bumping version to 0.0.97 2024-12-28 15:04:02 -08:00
d7f7954e59 Bumping version to 0.0.96 2024-12-28 15:02:21 -08:00
ba16e537e6 Bumping version to 0.0.95 2024-12-28 15:00:36 -08:00
60304a23cc update xtracing dependency 2024-12-28 15:00:30 -08:00
ce6aa7d167 Bumping version to 0.0.94 2024-12-28 09:09:41 -08:00
fb55d87876 update xtracing dependency 2024-12-28 09:09:27 -08:00
63374871ac Bumping version to 0.0.93 2024-12-28 08:44:51 -08:00
405dcc5ca6 update cacher dependency 2024-12-28 08:44:47 -08:00
1544405d3a Bumping version to 0.0.92 2024-12-28 08:43:49 -08:00
3b547f6925 update cacher dependency 2024-12-28 08:43:41 -08:00
777f33e212 notmuch: add instrumentation to most public methods 2024-12-26 11:12:47 -08:00
7c7a8c0dcb Bumping version to 0.0.91 2024-12-25 16:22:36 -08:00
6c2722314b server: fix compile problem with new PG schema 2024-12-25 16:22:19 -08:00
7827c24016 Bumping version to 0.0.90 2024-12-25 16:19:16 -08:00
043e46128a cargo sqlx prepare 2024-12-25 16:19:15 -08:00
dad30357ac server: ensure post.link is not null and not empty 2024-12-25 10:12:33 -08:00
4c6b9cde39 Bumping version to 0.0.89 2024-12-25 08:03:13 -08:00
ffb210babb server: ensure uniqueness on post links 2024-12-25 08:02:36 -08:00
145d1c1787 Bumping version to 0.0.88 2024-12-21 16:52:24 -08:00
1708526e33 update xtracing dependency 2024-12-21 16:52:19 -08:00
f8f9b753a6 Bumping version to 0.0.87 2024-12-21 16:23:13 -08:00
7fbb0e0f43 update xtracing dependency 2024-12-21 16:23:09 -08:00
2686670df7 Bumping version to 0.0.86 2024-12-21 16:21:19 -08:00
732fb5054a update xtracing dependency 2024-12-21 16:21:14 -08:00
2abfbda2f0 Bumping version to 0.0.85 2024-12-21 16:19:42 -08:00
cce693174e update xtracing dependency 2024-12-21 16:19:37 -08:00
bec7ee40b4 Bumping version to 0.0.84 2024-12-21 13:18:26 -08:00
79a6245773 update xtracing dependency 2024-12-21 13:18:22 -08:00
3ae1c3fdff Bumping version to 0.0.83 2024-12-21 13:16:50 -08:00
eab96b3f84 update xtracing dependency 2024-12-21 13:16:44 -08:00
07b8db317b cargo sqlx prepare 2024-12-21 13:16:18 -08:00
9debec8daa Bumping version to 0.0.82 2024-12-21 13:10:22 -08:00
b129b99fd9 cargo sqlx prepare 2024-12-21 13:10:21 -08:00
a397bcf190 update xtracing dependency 2024-12-21 13:10:16 -08:00
13c80fe68f update xtracing dependency 2024-12-21 13:08:01 -08:00
438ab0015e update xtracing dependency 2024-12-21 13:07:14 -08:00
93f5145937 update xtracing dependency 2024-12-21 13:06:59 -08:00
36fcc349ec update xtracing dependency 2024-12-21 13:05:31 -08:00
63a1919872 update xtracing dependency 2024-12-21 13:02:59 -08:00
5b6d18bdbc Bumping version to 0.0.81 2024-12-20 09:25:51 -08:00
868d2fb434 xtracing version bump 2024-12-20 09:25:46 -08:00
6ad66a35e7 Bumping version to 0.0.80 2024-12-20 09:18:27 -08:00
cd750e7267 Update xtracing 2024-12-20 09:16:41 -08:00
40be07cb07 Bumping version to 0.0.79 2024-12-20 09:06:45 -08:00
e794a902dd server: clean up some renamed imports 2024-12-20 09:06:35 -08:00
94576e98fc Bumping version to 0.0.78 2024-12-20 09:06:08 -08:00
b7dcb2e875 server: rename crate and binary to letterbox-server 2024-12-20 09:05:35 -08:00
aa9a243894 Bumping version to 0.0.77 2024-12-20 08:43:47 -08:00
1911367aeb cargo update 2024-12-20 08:43:37 -08:00
93bb4a27b9 Bumping version to 0.0.76 2024-12-19 18:44:31 -08:00
0456efeed4 cargo sqlx prepare 2024-12-19 18:44:30 -08:00
3ac2fa290f server: use git version of xtracing 2024-12-19 18:44:13 -08:00
e7feb73f6f lint 2024-12-19 18:38:43 -08:00
5ddb4452ff email2db: stub CLI 2024-12-19 18:35:46 -08:00
760f90762d server: refer to async_graphql extensions through extensions module 2024-12-19 18:35:03 -08:00
51154044cc WIP 2024-12-19 12:56:53 -08:00
06c5cb6cbf Update offline sqlx files on build 2024-12-19 12:50:10 -08:00
0dc1f2cebe Bumping version to 0.0.75 2024-12-19 11:35:18 -08:00
0dec7aaf0e web: pin wasm-bindgen 2024-12-19 11:35:00 -08:00
6fa8d1856a Revert "web: fix breakage due to update in dependency"
This reverts commit 80d23204fe.
2024-12-19 11:34:33 -08:00
95a0279c68 Bumping version to 0.0.74 2024-12-19 11:04:55 -08:00
80d23204fe web: fix breakage due to update in dependency 2024-12-19 11:04:39 -08:00
f45123d6d9 Bumping version to 0.0.73 2024-12-19 10:53:51 -08:00
503913c54a Bumping version to 0.0.72 2024-12-19 10:46:47 -08:00
c4627a13b6 cargo sqlx prepare 2024-12-19 10:46:39 -08:00
e4427fe725 Bumping version to 0.0.71 2024-12-19 10:44:15 -08:00
78f5f00225 cargo update 2024-12-19 10:44:05 -08:00
c6fc34136a Version bump sqlx 2024-12-19 10:44:05 -08:00
1a270997c8 Update xtracing 2024-12-19 10:38:56 -08:00
390fbcceac Bumping version to 0.0.70 2024-12-17 13:57:25 -08:00
d7214f4f29 server: move notmuch refresh out of tantivy cfg block for refresh 2024-12-17 13:57:06 -08:00
b9aaf87dc2 Bumping version to 0.0.69 2024-12-17 09:38:26 -08:00
5ee9d754ba server: actually disable tantivy 2024-12-17 09:38:19 -08:00
dc04d54455 cargo sqlx prepare 2024-12-17 09:34:03 -08:00
9f730e937d Bumping version to 0.0.68 2024-12-17 09:32:13 -08:00
13eaf33b1a server: add postgres based newsreader search and disable tantivy 2024-12-17 09:31:51 -08:00
e36f4f97f9 server: run DB migrations on startup 2024-12-16 19:21:58 -08:00
092d5781ca Bumping version to 0.0.67 2024-12-16 19:21:34 -08:00
0697a5ea41 server: more instrumentation 2024-12-16 19:21:05 -08:00
607e9e2251 Bumping version to 0.0.66 2024-12-16 08:56:24 -08:00
c547170efb server: address lint 2024-12-16 08:56:16 -08:00
0222985f4d server: instrument newsreader impl 2024-12-16 08:56:05 -08:00
94c03a9c7c Bumping version to 0.0.65 2024-12-16 08:34:53 -08:00
4f4e474e66 server: explicitly reload tantivy reader after commit 2024-12-16 08:34:35 -08:00
7a1dec03a3 Bumping version to 0.0.64 2024-12-15 16:26:38 -08:00
f49bc071c2 server: version bump xtracing 2024-12-15 16:26:22 -08:00
8551f0c756 Bumping version to 0.0.63 2024-12-15 15:43:56 -08:00
ac4aaeb0f7 server: warn on failure to open tantivy 2024-12-15 15:43:44 -08:00
4ad963c3be Bumping version to 0.0.62 2024-12-15 15:18:36 -08:00
7c943afc2b server: attempt concurrency with graphql::search and fail 2024-12-15 15:09:41 -08:00
39ea5c5458 Bumping version to 0.0.61 2024-12-15 14:46:53 -08:00
6d8b2de608 server: improve tantivy performance by reusing IndexReader
Also improve a bunch of trace logging
2024-12-15 14:46:10 -08:00
05cdcec244 notmuch: improved error handling and logging 2024-12-15 14:44:02 -08:00
a0eb8dcba6 server: add TODO 2024-12-14 11:56:33 -08:00
9fbfa378bb Bumping version to 0.0.60 2024-12-14 10:09:48 -08:00
872771b02a server: add tracing for graphql handling 2024-12-14 10:09:33 -08:00
416d82042f Bumping version to 0.0.59 2024-12-10 09:13:22 -08:00
a0eb291371 web: make post favicon more cacheable 2024-12-10 09:13:11 -08:00
4c88ee18d3 Bumping version to 0.0.58 2024-12-09 13:17:09 -08:00
410e582b44 web: use favicon for avatar when viewing a post 2024-12-09 13:16:55 -08:00
a3f720a51e Bumping version to 0.0.57 2024-12-08 18:05:12 -08:00
962b3542ce web: show email address on hover of name in message view 2024-12-08 18:03:20 -08:00
a6f0971f0f Bumping version to 0.0.56 2024-11-13 17:43:16 -08:00
21789df60a server: handle attachments with name in content-type not disposition 2024-11-13 17:42:53 -08:00
584ff1504d cargo fmt to catch unformatted code while LSP was misconfigured 2024-11-03 08:33:10 -08:00
caff1a1ed3 web: remove unnecessary move 2024-10-30 20:07:32 -07:00
d7b4411017 web: update cargo edition 2024-10-30 19:59:06 -07:00
66ada655fc Bumping version to 0.0.55 2024-10-29 17:16:58 -07:00
8dea1f1bd6 web: fix styling on news post tags to match email 2024-10-29 17:16:45 -07:00
e7a865204d Bumping version to 0.0.54 2024-10-27 12:27:34 -07:00
3138379e7d web: add tag when viewing news posts 2024-10-27 12:27:16 -07:00
7828fa0ac8 server: add slurper config for rustacean station 2024-10-27 12:15:43 -07:00
b770bb8986 server: add slurp config for grafana 2024-10-27 12:14:15 -07:00
07c0150d3e Bumping version to 0.0.53 2024-10-27 12:03:25 -07:00
f678338822 server: lint, including bug fix 2024-10-27 12:03:16 -07:00
6e15e69254 server: handle forwarded rfc822 messages 2024-10-27 12:02:00 -07:00
2671a3b787 Bumping version to 0.0.52 2024-10-27 10:56:11 -07:00
93073c9602 server: fix pagination counts for tantivy results 2024-10-27 10:55:49 -07:00
88f8a9d537 Bumping version to 0.0.51 2024-10-13 17:40:35 -07:00
b75b298a9d web: match email header styling when viewing post 2024-10-13 17:40:20 -07:00
031b8ce80e Bumping version to 0.0.50 2024-10-03 09:21:48 -07:00
b0ceba3bcf web: consistent html between open/close header, move padding into header code 2024-10-03 09:21:12 -07:00
e5f5b8ff3c Bumping version to 0.0.49 2024-10-03 09:04:03 -07:00
afb1d291ec web: fix right justify of read icon/timestamp on closed message header 2024-10-03 09:03:22 -07:00
55b46ff929 Bumping version to 0.0.48 2024-10-01 17:20:01 -07:00
58acd8018a web: more dense email headers 2024-10-01 17:19:52 -07:00
e0d0ede2ce Bumping version to 0.0.47 2024-10-01 15:12:20 -07:00
ac46b0e4d0 web: change up spacing in email headers. Increase density 2024-10-01 15:12:02 -07:00
e12ea2d7e4 Bumping version to 0.0.46 2024-09-29 19:17:07 -07:00
5f052facdf web: fix styling of envelope on closed headers 2024-09-29 19:16:51 -07:00
4476749203 Bumping version to 0.0.45 2024-09-29 19:05:59 -07:00
0fa860bc71 web: show email address when no name present 2024-09-29 19:05:46 -07:00
b858b23584 Bumping version to 0.0.44 2024-09-29 18:03:05 -07:00
6500e60c40 web: remove dead code 2024-09-29 18:02:45 -07:00
efc991923d Bumping version to 0.0.43 2024-09-29 17:56:39 -07:00
0b5e057fe6 web: fix spacing when there are few To/CC 2024-09-29 17:56:25 -07:00
822e1b0a9c Bumping version to 0.0.42 2024-09-29 17:15:57 -07:00
4f21814be0 web: successfully rewrite some bits in tailwind 2024-09-29 17:15:28 -07:00
17da489229 web: WIP tailwind integration 2024-09-29 16:43:29 -07:00
5b8639b80f Bumping version to 0.0.41 2024-09-29 16:41:36 -07:00
6c9ef912e6 server: don't touch tantivy if no uids reindexed 2024-09-29 16:41:13 -07:00
da636ca1f3 Bumping version to 0.0.40 2024-09-29 16:28:37 -07:00
7880eddccd Bumping version to 0.0.39 2024-09-29 16:28:25 -07:00
3ec1741f10 web & server: using tantivy for news post search 2024-09-29 16:28:05 -07:00
f36d1e0c29 server: continue if db path missing on create_news_db 2024-09-28 12:29:12 -07:00
ebf32a9905 server: WIP tantivy integration 2024-09-28 12:29:12 -07:00
005a457348 Bumping version to 0.0.38 2024-09-28 12:28:53 -07:00
a89a279764 notmuch: use faster, but inaccurate message count 2024-09-28 12:28:41 -07:00
fbc426f218 Bumping version to 0.0.37 2024-09-28 12:23:29 -07:00
27b480e118 web: try alternative for clearing screen on build 2024-09-28 12:22:35 -07:00
dee6ff9ba0 Bumping version to 0.0.36 2024-09-28 12:06:12 -07:00
73bdcd5441 server: add pjpeg support for attachments 2024-09-28 12:06:00 -07:00
64a38e024d Bumping version to 0.0.35 2024-09-28 11:18:39 -07:00
441b40532f Bumping version to 0.0.34 2024-09-28 11:18:37 -07:00
bfb6a6226d Bumping version to 0.0.33 2024-09-28 11:18:37 -07:00
f464585fad web: tweak hr styling 2024-09-28 11:18:37 -07:00
3fe61f8b09 web: clear screen on rebuild 2024-09-28 11:18:37 -07:00
43b3625656 server: join slurped parts with <hr> elements 2024-09-28 11:16:10 -07:00
6505c90f32 Bumping version to 0.0.32 2024-09-26 16:28:02 -07:00
104eb189fe web: shrink <hr> margins 2024-09-26 16:27:50 -07:00
b70e0018d7 Bumping version to 0.0.31 2024-09-25 19:46:15 -07:00
d962d515f5 web: shorten outbound link on news post 2024-09-25 19:45:52 -07:00
3c8d7d4f81 server: move tantivy code to separate mod 2024-09-22 10:26:45 -07:00
d1604f8e70 server: remove done TODO 2024-09-21 18:48:25 -07:00
6f07817c0e Bumping version to 0.0.30 2024-09-21 13:01:27 -07:00
0ac959ab76 server: add slurp config for ingowald 2024-09-21 13:01:17 -07:00
62b17bd6a6 Bumping version to 0.0.29 2024-09-20 08:56:58 -07:00
c0bac99d5a server: add slurp config for zsa blog 2024-09-20 08:56:45 -07:00
3b69c5e74b Bumping version to 0.0.28 2024-09-19 17:06:03 -07:00
539fd469cc server: create index when missing 2024-09-19 17:05:47 -07:00
442688c35c web: lint 2024-09-19 16:54:18 -07:00
da27f02237 Bumping version to 0.0.27 2024-09-19 16:52:35 -07:00
9460e354b7 server: cargo sqlx prepare 2024-09-19 16:52:26 -07:00
6bab128ed9 Bumping version to 0.0.26 2024-09-19 16:33:50 -07:00
3856b4ca5a server: try different cacher url 2024-09-19 16:33:40 -07:00
bef39eefa5 Bumping version to 0.0.25 2024-09-19 16:08:20 -07:00
b0366c7b4d server: try non-https to see if that works 2024-09-19 16:07:59 -07:00
ca02d84d63 Bumping version to 0.0.24 2024-09-19 16:01:55 -07:00
461d5de886 server: change internal git url 2024-09-19 16:01:41 -07:00
f8134dad7a Bumping version to 0.0.23 2024-09-19 15:53:56 -07:00
30f510bb03 server: WIP tantivy, cache slurps, use shared::compute_color, 2024-09-19 15:53:09 -07:00
e7cbf9cc45 shared: remove debug logging 2024-09-19 13:54:47 -07:00
5108213af5 web: use shared compute_color 2024-09-19 13:49:24 -07:00
d148f625ac shared: add compute_color 2024-09-19 13:48:56 -07:00
a9b8f5a88f Bumping version to 0.0.22 2024-09-16 20:00:16 -07:00
539b584d9b web: fix broken build 2024-09-16 20:00:06 -07:00
2f8d83fc4b Bumping version to 0.0.21 2024-09-16 19:52:28 -07:00
86ee1257fa web: better progress bar 2024-09-16 19:52:20 -07:00
03f1035e0e Bumping version to 0.0.20 2024-09-12 22:38:18 -07:00
bd578191a8 web: add scroll to top button and squelch some debug logging 2024-09-12 22:37:58 -07:00
d4fc2e2ef1 Bumping version to 0.0.19 2024-09-12 15:41:01 -07:00
cde30de81c web: explicitly set progress to zero when not in thread/news view 2024-09-12 15:40:42 -07:00
96be74e3ee Bumping version to 0.0.18 2024-09-12 15:32:30 -07:00
b78d34b27e web: disable bulma styling for .number 2024-09-12 15:32:18 -07:00
b4b64c33a6 Bumping version to 0.0.17 2024-09-12 10:07:00 -07:00
47b1875022 server: tweak cloudflare and prusa slurp config 2024-09-12 10:06:46 -07:00
b06cbd1381 Bumping version to 0.0.16 2024-09-12 10:03:26 -07:00
9e35f8ca6c web: fix <em> looking like a button 2024-09-12 10:01:58 -07:00
8eaefde67d Bumping version to 0.0.15 2024-09-12 09:28:14 -07:00
d5a3324837 server: slurp config for prusa blog and squelch some info logging 2024-09-12 09:27:57 -07:00
f5c90d8770 Bumping version to 0.0.14 2024-09-11 11:46:04 -07:00
825a125a62 web: redox specific styling 2024-09-11 11:45:53 -07:00
da7cf37dae Bumping version to 0.0.13 2024-09-11 11:41:27 -07:00
1985ae1f49 server: add slurp configs for facebook and redox 2024-09-11 11:41:09 -07:00
91eb3019f9 Bumping version to 0.0.12 2024-09-09 20:31:07 -07:00
66e8e00a9b web: remove dead code 2024-09-09 20:21:51 -07:00
4b8923d852 web: more accurate reading progress bar 2024-09-09 20:21:13 -07:00
baba720749 Bumping version to 0.0.11 2024-09-02 13:36:18 -07:00
1ec22599cc web: make pre blocks look like code blocks in email 2024-09-02 13:35:58 -07:00
c69017bc36 Bumping version to 0.0.10 2024-09-02 13:19:11 -07:00
48bf57fbbe web: more pleasant color scheme for code blocks in email 2024-09-02 13:18:49 -07:00
3491856784 Bumping version to 0.0.9 2024-09-01 16:17:35 -07:00
f887c15b46 web: address lint 2024-09-01 16:17:27 -07:00
7786f850d1 Bumping version to 0.0.8 2024-09-01 16:16:09 -07:00
cad778734e web: rename Msg::Reload->Refresh and create proper Reload 2024-09-01 16:15:38 -07:00
1210f7038a Bumping version to 0.0.7 2024-09-01 16:09:14 -07:00
f9ab7284a3 web: remove obsolete Makefile 2024-09-01 16:09:04 -07:00
100865c923 server: use same html cleanup idiom in nm as we do in newsreader 2024-09-01 16:08:25 -07:00
b8c1710a83 dev: watch for git commits and rebuild on change 2024-09-01 16:07:22 -07:00
215b8cd41d shared: ignore dirty, if git is present we're developing
When developing, dirty state can get out of sync between client and server
if you're only doing development in one.
2024-09-01 15:57:02 -07:00
487d7084c3 Bumping version to 0.0.6 2024-09-01 15:48:41 -07:00
b1e761b26f web: don't show progress bar until 400px have scrolled 2024-09-01 15:48:11 -07:00
3efe90ca21 Update release makefile 2024-09-01 15:40:19 -07:00
61649e1e04 Bumping version to 0.0.5 2024-09-01 15:38:39 -07:00
13ac352a10 Helpers to bump version number 2024-09-01 15:37:00 -07:00
5ca7a25e8d Bumping version to 0.0.4 2024-09-01 15:36:48 -07:00
7bb8ef0938 Bumping version to :?} 2024-09-01 15:36:36 -07:00
5c55a290ac Bumping version to :?} 2024-09-01 15:34:53 -07:00
4e3e1b075d Setting crate version to 0.2.0-a8c5a16 2024-09-01 15:30:37 -07:00
a8c5a164ff web: clean up version string and reload on mismatch 2024-09-01 15:02:34 -07:00
1f393f1c7f Add server and client build versions 2024-09-01 14:55:51 -07:00
fdaff70231 server: improve cloudflare and grafana image and iframe rendering 2024-09-01 11:05:07 -07:00
7218c13b9e server: address lint 2024-08-31 16:18:47 -07:00
934cb9d91b web: address lint 2024-08-31 16:11:49 -07:00
4faef5e017 web: add scrollbar for read progress 2024-08-31 16:08:06 -07:00
5c813e7350 web: style improvements for figure captions 2024-08-31 15:04:19 -07:00
fb754469ce web: let pullquotes on grafana blog be full width 2024-08-31 14:46:38 -07:00
548b5a0ab0 server: extract image title and alt attributes into figure captions 2024-08-31 14:43:04 -07:00
f77d0776c4 web: style tweaks for <em> 2024-08-31 14:42:19 -07:00
e73f70af8f Fix new post read/unread handling 2024-08-31 13:49:03 -07:00
a9e6120f81 web: don't make slashdot pull quotes italic 2024-08-31 13:36:21 -07:00
090a010a63 server: fix thread id for news posts 2024-08-31 13:23:25 -07:00
85c762a297 web: add class for mail vs news-post bodies 2024-08-31 11:54:19 -07:00
a8d5617cf2 Treat email and news posts as distinct types on the frontend and backend 2024-08-31 11:40:06 -07:00
760cec01a8 Refactor thread responses into an enum.
Lays ground work for different types of views, i.e. email, news, docs, etc.
2024-08-26 21:48:53 -07:00
446fcfe37f server: fix url for graphiql 2024-08-26 21:48:25 -07:00
71de3ef8ae server: add ability to slurp contents from site 2024-08-25 19:37:53 -07:00
d98d429b5c notmuch: add TODO 2024-08-25 19:37:37 -07:00
cf5a6fadfd server: sort dependencies 2024-08-24 09:26:52 -07:00
9a078cd238 server: only add "view on site" link if it's not in the html body 2024-08-19 10:57:09 -07:00
a81a803cca server: include default chrome CSS as a baseline for news threads 2024-08-19 10:47:38 -07:00
816587b688 server: fix download of chrome default CSS 2024-08-19 10:47:14 -07:00
4083c58bbd server: add chrome default styles
From:
https://source.chromium.org/chromium/chromium/src/+/main:third_party/blink/renderer/core/html/resources/html.css
2024-08-19 10:31:59 -07:00
8769e5acd4 server: fix counting issue w/ notmuch (messages vs threads) 2024-08-18 14:18:15 -07:00
3edf9fdb5d web: fix age display when less than 1 minute 2024-08-18 12:55:39 -07:00
ac0ce29c76 web: preserve checked boxes on search refresh 2024-08-18 11:04:31 -07:00
5279578c64 server: fix inline image loading 2024-08-17 16:33:53 -07:00
632f64261e server: fix notmuch paging bug 2024-08-15 16:21:27 -07:00
b5e25eef78 server: fix paging if only notmuch results are found 2024-08-15 14:58:23 -07:00
8a237bf8e1 server: add link to news posts back to original article 2024-08-12 21:14:32 -07:00
c5def6c0e3 web: allow clicking anywhere in the subject line in search results 2024-08-12 20:54:16 -07:00
d1cfc77148 server: more news title/body cleanup, and don't search news so much 2024-08-12 20:53:48 -07:00
c314e3c798 web: make whole row of search results clickable
No longer allow searching by tag by clicking on chiclet
2024-08-06 21:37:38 -07:00
7c5ef96ff0 server: fix paging bug where p1->p2->p1 wouldn't show consistent results 2024-08-06 21:15:10 -07:00
474cf38180 server: cargo sqlx prepare 2024-08-06 20:55:05 -07:00
e81a452dfb web: scroll to top when viewing a new tag 2024-08-06 20:54:25 -07:00
e570202ba2 Merge news and email search results 2024-08-06 20:44:25 -07:00
a84c9f0eaf server: address some lint 2024-08-05 15:54:26 -07:00
530bd8e350 Inline mvp and custom override CSS when rendering RSS posts 2024-08-05 15:47:31 -07:00
359e798cfa server: going with mvp.css not normalize.css 2024-08-04 21:23:05 -07:00
d7d257a6b5 https://andybrewer.github.io/mvp/mvp.css 2024-08-04 21:22:34 -07:00
9ad9ff6879 https://necolas.github.io/normalize.css/8.0.1/normalize.css 2024-08-03 21:31:09 -07:00
56bc1cf7ed server: escape RSS feeds that are HTML escaped 2024-08-03 11:29:20 -07:00
e0863ac085 web: more robust avatar initial filtering 2024-07-29 17:29:15 -07:00
d5fa89b38c web: show tag list in all modalities. WIP 2024-07-29 08:48:44 -07:00
605af13a37 web: monospace font for plain text emails 2024-07-29 08:32:28 -07:00
3838cbd6e2 cargo fix 2024-07-24 11:08:47 -07:00
c76df0ef90 web: update copy icon in more places 2024-07-24 11:06:38 -07:00
cd77d302df web: small icon tweak for copying email addresses 2024-07-24 11:03:32 -07:00
71348d562d version bump 2024-07-24 11:03:26 -07:00
b6ae46db93 Move cargo config up a directory 2024-07-22 16:56:13 -07:00
6cb84054ed Only build server by default 2024-07-22 16:48:47 -07:00
7b511c1673 Fix cleanhtml build 2024-07-22 16:41:14 -07:00
bfd5e12bea Make URL joining more robust 2024-07-22 16:39:59 -07:00
ad8fb77857 Add copy to clipboard links to from/to/cc addresses 2024-07-22 16:04:25 -07:00
831466ddda Add mark read/unread support for news 2024-07-22 14:43:05 -07:00
4ee34444ae Move thread: and id: prefixing to server side.
This paves way for better news: support
2024-07-22 14:26:48 -07:00
879ddb112e Remove some logging and fix a comment 2024-07-22 14:26:24 -07:00
331fb4f11b Fix build 2024-07-22 12:19:45 -07:00
4e5275ca0e cargo sqlx prepare 2024-07-22 12:19:38 -07:00
1106377550 Normalize links and images based on post's URL 2024-07-22 11:27:15 -07:00
b5468bced2 Implement pagination for newsreader 2024-07-22 09:28:12 -07:00
01cbe6c037 web: set reasonable defaults on front page requests 2024-07-22 08:28:12 -07:00
d0a02c2f61 cargo fix lint 2024-07-22 08:19:07 -07:00
c499672dde Rollback attempt to make unread tag queries faster for newsreader 2024-07-22 08:17:46 -07:00
3aa0b94db4 Fix bug in pagination when more than SEARCH_RESULTS_PER_PAGE returned 2024-07-22 08:13:45 -07:00
cdb64ed952 Remove old search URLs 2024-07-22 07:25:15 -07:00
834efc5c94 Handle needs_unread on tag query. Move News to top of tag list 2024-07-22 07:24:28 -07:00
79db94f67f Add pretty site names to search and thread views 2024-07-21 20:50:50 -07:00
ec41f840d5 Store remaining text when parsing query 2024-07-21 15:19:19 -07:00
d9d57c66f8 Sort by title on date tie breaker 2024-07-21 15:18:31 -07:00
9746c9912b Implement newsreader counting 2024-07-21 15:13:09 -07:00
abaaddae3a Implement unread filtering on threads 2024-07-21 15:12:32 -07:00
0bf64004ff server: order tags alphabetically 2024-07-21 13:09:08 -07:00
6fae9cd018 WIP basic news thread rendering 2024-07-21 12:50:21 -07:00
65fcbd4b77 WIP move thread loading for notmuch into nm mod 2024-07-21 09:31:37 -07:00
dd09bc3168 WIP add search 2024-07-21 09:05:03 -07:00
0bf865fdef WIP reading news from app 2024-07-21 07:53:02 -07:00
5c0c45b99f Revert "Make blockquotes fancier"
This reverts commit 221f046664.
2024-07-13 15:21:59 -07:00
221f046664 Make blockquotes fancier 2024-07-13 09:19:52 -07:00
2a9d5b393e Use default styling for lists. 2024-07-13 09:02:35 -07:00
90860e5511 Remove profile from workspace config 2024-07-13 09:02:19 -07:00
0b1f806276 web: visualize blockquote better 2024-07-12 07:44:31 -07:00
0482713241 address cargo udeps 2024-07-07 15:06:04 -07:00
bb3e18519f cargo update 2024-07-07 14:59:10 -07:00
3a4d08facc web: lint 2024-07-07 14:43:58 -07:00
30064d5904 server: fix broken open-link-in-new-tab from recent changes 2024-07-07 14:40:37 -07:00
c288b7fd67 Disable running test 2024-07-06 18:47:55 -07:00
b4d1528612 web: migrate from lib->bin 2024-07-06 18:18:28 -07:00
5fc272054c Put all URLs under /api/ 2024-07-05 20:00:52 -07:00
714e73aeb1 Address a bunch of lint 2024-07-05 10:44:37 -07:00
3dfd2d48b3 Fix compile error 2024-07-05 10:40:14 -07:00
3a5a9bd66a Add support for inline images 2024-07-05 10:38:12 -07:00
55d7aec516 server: handle multipart/related with a multipart/alternative embedded 2024-05-05 19:03:38 -07:00
96d3e4a7d6 Version bump 2024-05-02 09:30:11 -07:00
beb96aba14 web: fix inverted boolean on spam shortcut 2024-04-29 21:04:56 -07:00
48f66c7096 web: when marking spam, also mark it as read 2024-04-14 08:17:36 -07:00
a96b553b08 Version bumps to get fixes to mailparse & data-encoding 2024-04-14 07:55:26 -07:00
31a3ac66b6 web: swap spam and read/unread buttons 2024-04-08 20:51:56 -07:00
a33e1f5d3c Update lock 2024-04-06 16:22:30 -07:00
423ea10d34 web: use upstream human_format 2024-04-06 16:20:15 -07:00
1b221d5c16 web&server: show raw body contents of UnhandledContentType 2024-04-06 10:21:31 -07:00
d4038f40d6 web: add UI to remove tags when viewing messages 2024-04-06 09:38:00 -07:00
dc7b3dd3e8 web: human format attachment size 2024-04-06 08:52:20 -07:00
1f5f10f78d server: properly filter inline vs attachments 2024-04-06 08:34:26 -07:00
7df11639ed web: don't show text on action icons on tablet/mobile 2024-04-06 08:10:04 -07:00
b0305b7411 web: separate spam button from read buttons and color red. 2024-04-06 08:00:35 -07:00
8abf9398e9 web: add mark as spam buttons 2024-04-03 21:10:23 -07:00
1b196a2703 server: add ability to add/remove labels 2024-04-03 21:07:06 -07:00
a24f456136 web: don't show mime type on attachment 2024-04-03 20:28:51 -07:00
d8fef54606 web: add attachment icons 2024-04-03 20:25:42 -07:00
9a5dc20f83 server: add functioning download attachment handler 2024-03-26 08:25:52 -07:00
ff1c3f5791 server: preserve class attribute on sanitized html 2024-03-26 08:25:37 -07:00
c74cd66826 server: add ability to view inline image attachments 2024-03-24 18:11:15 -07:00
c30cfec09d web: cleanup lint 2024-03-05 09:24:41 -08:00
e20e794508 web: remove mostly useless footer 2024-03-05 09:23:59 -08:00
d09efd3a69 web: overflow:auto the body so wide messages behave better 2024-03-05 09:20:31 -08:00
1ac7f5b6dc web: handle empty subjects 2024-03-05 09:04:19 -08:00
fc7a4a747c web: debug search for tag:letterbox instead of is:unread 2024-02-28 19:13:37 -08:00
facea2326e web: make from and date area clickable on search results page 2024-02-27 09:46:23 -08:00
56311bbe05 web: css cleanup for search results table 2024-02-27 09:07:49 -08:00
994631e872 web: display To/CC differently on expansion 2024-02-26 11:24:09 -08:00
43471d162f web: make empty subject line clickable 2024-02-26 11:01:20 -08:00
b997a61da8 web: better wrapping behavior for plain text messages 2024-02-24 09:14:50 -08:00
f69dd0b198 server: debug print unhandled mimetypes for some multipart messages 2024-02-23 16:55:13 -08:00
523584fbbc web: change style for attachments 2024-02-23 16:54:53 -08:00
4139ec38d8 web: add TODO about message and thread id types 2024-02-23 16:10:17 -08:00
5379ae09dc server: replace string literals in a bunch of places with consts 2024-02-23 16:09:58 -08:00
ebb16aef9e web: make mark read/unread icon target much larger 2024-02-23 07:07:20 -08:00
fc87fd702c web: refactor header rendering code, add more detail when message open 2024-02-22 21:19:09 -08:00
42484043a1 web: have colored initials for From
Add scaffolding for profile pics
2024-02-22 20:37:21 -08:00
3f268415e9 web: rework header in thread view, tweak some styles, remove some logging 2024-02-22 18:54:34 -08:00
c2a5fe19e3 web: go back to search page after changing read status 2024-02-21 17:58:12 -08:00
42ce88d931 web: add select all/partial/none for search table 2024-02-21 15:02:58 -08:00
cda99fc7a5 web: improve checkbox style on desktop 2024-02-20 20:20:50 -08:00
b33a252698 web: label read/unread icons 2024-02-20 20:16:25 -08:00
9e3ae22827 web: lint 2024-02-20 19:59:35 -08:00
5923547159 web: handle expand/collapse of messages separate from unread status 2024-02-20 19:58:50 -08:00
fe980c5468 web: lint 2024-02-20 19:25:28 -08:00
f50fe7196e web: add bulk read/unread functionality 2024-02-20 19:24:56 -08:00
de3f392bd7 web: use bold text to indicate unread messages 2024-02-20 14:29:42 -08:00
02c0d36f90 web: remove a ton of legacy deprecated code 2024-02-20 14:13:06 -08:00
04592ddcc4 web: change up unread message styles 2024-02-20 13:55:54 -08:00
c8e0f68278 web: remove info statement 2024-02-16 19:24:16 -08:00
4957b485a0 web: add mark read button on search result page 2024-02-16 19:23:35 -08:00
7ebe517a34 web: tweak subject line style 2024-02-11 20:48:26 -08:00
516eedb086 web: add per-message unread control and display 2024-02-11 20:29:49 -08:00
ce836cd1e8 notmuch: add tag manipulation 2024-02-11 19:59:20 -08:00
f7010fa278 cargo update 2024-02-11 19:54:35 -08:00
5451dd2056 server: add mutation to mark messages as read 2024-02-11 19:43:34 -08:00
81ed3a8ca2 Linkify URLs missing scheme 2024-02-07 19:41:34 -08:00
0f1a60a348 Sanitize html when linkifying plain text. 2024-02-03 11:15:57 -08:00
c59a883351 Address lint. 2024-02-03 11:14:43 -08:00
568d83f029 linkify URLs in plaintext emails. 2024-02-03 11:10:51 -08:00
569781b592 Tweak CSS for viewing body of messages 2024-01-20 08:34:25 -08:00
1b00c9e944 Updated cargo lock 2024-01-20 08:14:59 -08:00
901785e47c Change footer class to prevent conflict with email bodies. 2024-01-20 08:14:37 -08:00
8c47f01758 Improve server side html sanitization. 2024-01-20 08:14:10 -08:00
304819275d Open links in a new tab. 2024-01-19 21:07:24 -08:00
b1ea44963d Lint and cleanup empty file. 2024-01-17 12:31:56 -08:00
181965968c state: auto reload every 30 seconds 2024-01-17 12:31:37 -08:00
5b3eadb7bd Run tests before rebuilding app 2024-01-06 08:53:06 -08:00
28d484117b Change makefile to use variable for app name.
Make this more copypastable.
2024-01-06 08:52:42 -08:00
a0b0689e01 Fix wrapping/sizing of message bodies with long unbreakable text. 2024-01-06 08:52:19 -08:00
33ec63f097 web: update seed_hooks to my copy so I can pin to seed=0.10.0 2023-12-10 19:42:07 -08:00
7b22f85429 web: show union of tags when viewing thread 2023-12-10 17:26:24 -08:00
fa7df55b0e server: send tags on each message in thread 2023-12-10 17:26:04 -08:00
d2cf270dda web: properly truncate long headers on message view 2023-12-10 16:35:51 -08:00
f1b5e78962 web: make debug output hidden by default 2023-12-10 16:11:15 -08:00
fae4e43682 web: show thread count when greater than 1 2023-12-10 15:50:28 -08:00
37eb3d1dfd web: wrap content tree debug so messages aren't super wide 2023-12-07 10:24:39 -08:00
e0890f1181 web: search for unread tags when clicking under Unread section 2023-12-05 20:55:41 -08:00
c31f9d581f web: upgrade to seed-0.10.0 2023-12-05 20:46:59 -08:00
f2347345b4 Version bumps made css_inline uncompilable for wasm 2023-12-05 14:12:15 -08:00
e34f2a1f39 notmuch: fix tests 2023-12-05 12:50:52 -08:00
7a6000be26 server: address lint 2023-12-05 11:26:23 -08:00
dd1a8c2eae procmail2notmuch: WIP update script 2023-12-05 11:23:04 -08:00
42590b3cbc cargo update 2023-12-05 11:04:31 -08:00
94f7ad109a Merge commit 'f6bdf30' 2023-12-05 09:56:55 -08:00
f6bdf302fe server & notmuch: more attachment WIP, stop leaking notmuch processes 2023-12-03 14:01:18 -08:00
b76c535738 web: use log::error, not seed::error 2023-12-03 09:11:31 -08:00
29949c703d web: archive live site before pushing new one 2023-12-03 09:11:15 -08:00
f5f9eb175d server: WIP attachment serving 2023-12-03 09:11:00 -08:00
488c3b86f8 web: truncate raw messages and prep for attachments 2023-12-03 09:03:36 -08:00
be8fd59703 web: rename view_thread to take advantage of new namespaces 2023-12-03 08:49:20 -08:00
071fe2e206 web: show message-ID when viewing thread 2023-12-02 16:35:37 -08:00
ac5660a6d0 web: have trunk proxy /original/ requests to backend 2023-12-02 16:35:18 -08:00
99a104517d notmuch: comment typo 2023-12-02 16:35:05 -08:00
c3692cadec server: add id and header to ShowThreadQuery API 2023-12-02 16:34:44 -08:00
b14000952c server: make unread message counting much faster, remove rayon dep 2023-12-02 15:41:22 -08:00
7a32d5c630 server: include headers in debug output 2023-12-02 15:12:40 -08:00
714b057fdb web: add tablet rendering, listen to window resize events. 2023-12-02 10:56:14 -08:00
4c2526c70b web: remove unnecessary view_mobile_ prefix 2023-12-02 10:13:08 -08:00
a8f4aa03bd web: rename legacy functions to take advantage of mod namespacing 2023-12-02 10:11:56 -08:00
28d5562491 web: move legacy (pre-graphql) rendering to separate mod 2023-12-02 10:07:47 -08:00
e6f20e538a web: move mobile specific code to separate mod 2023-12-02 10:02:12 -08:00
970bb55c73 web: move desktop specific code into separate mod 2023-12-02 09:56:57 -08:00
12f0491455 web: remove stale comments 2023-12-02 09:34:57 -08:00
ef8362d6f2 web: remove some unused code 2023-12-02 09:34:26 -08:00
0a7cdefda3 web: refactor code into separate modules 2023-12-02 09:29:50 -08:00
cfe1446668 web: force tag list to be open when no unread messages 2023-11-29 09:27:53 -08:00
7c38962d21 web: make tag list hidable 2023-11-28 20:03:17 -08:00
7102f26c9e web: conditionally show unread section 2023-11-28 07:32:22 -08:00
71a3315fe8 web: lint and clean up search input handling 2023-11-27 21:11:12 -08:00
7cac81cddb web: update implement_email macro to handle repetition 2023-11-27 20:33:47 -08:00
3a5ca74d71 web: change tag list styling and show unread at the top 2023-11-27 19:48:19 -08:00
71af8179ec web: hierarchical tags list on desktop 2023-11-27 19:16:28 -08:00
d66a7d3d53 web: use singular version of view_address for From 2023-11-27 17:20:11 -08:00
e0fbb0253e web: create implement_email! macro 2023-11-27 17:16:57 -08:00
48466808d3 web & server: plumb debugging info for content type hierarchy.
Also cleanup Email trait.
2023-11-27 13:47:02 -08:00
87dfe4ace7 server: cleanup lint. 2023-11-26 21:31:06 -08:00
d45f223d52 server: fix pagination with small counts and no first/last set 2023-11-26 21:27:57 -08:00
e8c58bdbd0 server: handle multipart/mixed with an html or text subpart 2023-11-26 21:09:56 -08:00
87d687cde5 server: sanitize html using ammonia 2023-11-26 21:00:44 -08:00
c8147ded60 web & server: add handling for google calendar and wellsfargo emails. 2023-11-26 20:51:53 -08:00
1261bdf8a9 web & server: improved debug printing of unhandled mime types 2023-11-26 18:50:32 -08:00
11366b6fac web & server: implement handling for text and html bodies. 2023-11-26 16:37:29 -08:00
1cdabc348b web: better date formatting 2023-11-26 16:01:22 -08:00
02e16b4547 web: more compact output on desktop and mobile 2023-11-26 15:46:03 -08:00
d5a001bf03 web: refresh tags on thread view in addition to search results. 2023-11-26 15:31:51 -08:00
0ae72b63d0 web: add basic graphql view thread, no body support. 2023-11-26 15:27:19 -08:00
447a4a3387 server: basic graphql thread show, no body support yet. 2023-11-26 13:13:04 -08:00
0737f5aac5 web: rewrite frontend to use graphql for search results 2023-11-25 09:06:24 -08:00
3e3024dd5c server: handle search with no first/last better 2023-11-25 09:05:53 -08:00
24414b04bb server: fix backward pagination 2023-11-25 08:39:56 -08:00
f7df834325 notmuch: default empty search to wildcard 2023-11-25 08:39:30 -08:00
bce2c741c4 web: add non-functional graphql. 2023-11-21 14:06:48 -08:00
1b44bc57bb web: Initial commit of graphql schema and helper to update it. 2023-11-21 13:36:11 -08:00
ff6675b08f server: add unread field to tag query.
Optionally fill out unread, as it's expensive.
2023-11-21 13:17:11 -08:00
64912be4eb Hide quoted emails 2023-11-21 12:37:58 -08:00
57ccef18cb Make clicking search results on mobile easier. 2023-11-21 12:27:58 -08:00
2a24a20529 Revert stub show_pretty that will be obsoleted by graphql. 2023-11-21 08:35:35 -08:00
e6692059b4 Fix search pagination and add count RPC. 2023-11-20 21:18:40 -08:00
a7b172099b And graphql search with pagination. 2023-11-20 20:56:16 -08:00
f52a76dba3 Added graphql endpoint and tested with tags implementation. 2023-11-20 18:38:10 -08:00
43e4334890 Set default page size on server to match client side page size. 2023-11-20 17:57:07 -08:00
1d00bdb757 Squelch logging and remove unused variable. 2023-11-20 17:54:50 -08:00
6901c9fde9 Format today and yesterday better. 2023-11-20 17:53:49 -08:00
6251c54873 Show time of email >1 week 2023-11-20 17:47:06 -08:00
f6c1835b18 Custom formatting of the age string, widen subject column. 2023-11-20 17:41:58 -08:00
95976c2860 Mobile style tweaks. 2023-11-20 15:49:30 -08:00
01589d7136 Add favicon 2023-11-20 15:40:07 -08:00
a2664473c8 Improve density on mobile. 2023-11-14 21:33:09 -08:00
123 changed files with 23599 additions and 2814 deletions

9
.cargo/config.toml Normal file

@@ -0,0 +1,9 @@
[build]
rustflags = [ "--cfg=web_sys_unstable_apis" ]

[registry]
global-credential-providers = ["cargo:token"]

[registries.xinu]
index = "sparse+https://git.z.xinu.tv/api/packages/wathiede/cargo/"

10
.envrc Normal file

@@ -0,0 +1,10 @@
source_up
export DATABASE_USER="newsreader";
export DATABASE_NAME="newsreader";
export DATABASE_HOST="nixos-07.h.xinu.tv";
export DATABASE_URL="postgres://${DATABASE_USER}@${DATABASE_HOST}/${DATABASE_NAME}";
export PROD_DATABASE_USER="newsreader";
export PROD_DATABASE_NAME="newsreader";
export PROD_DATABASE_HOST="postgres.h.xinu.tv";
export PROD_DATABASE_URL="postgres://${PROD_DATABASE_USER}@${PROD_DATABASE_HOST}/${PROD_DATABASE_NAME}";

67
.gitea/workflows/rust.yml Normal file

@@ -0,0 +1,67 @@
on: [push]
name: Continuous integration
jobs:
  check:
    name: Check
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions-rust-lang/setup-rust-toolchain@v1
      - run: cargo check
  test:
    name: Test Suite
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions-rust-lang/setup-rust-toolchain@v1
      - run: cargo test
  trunk:
    name: Trunk
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions-rust-lang/setup-rust-toolchain@v1
        with:
          toolchain: nightly
          target: wasm32-unknown-unknown
      - run: cargo install trunk
      - run: cd web; trunk build
  fmt:
    name: Rustfmt
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions-rust-lang/setup-rust-toolchain@v1
        with:
          components: rustfmt
      - name: Rustfmt Check
        uses: actions-rust-lang/rustfmt@v1
  build:
    name: build
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions-rust-lang/setup-rust-toolchain@v1
      - run: cargo build
  udeps:
    name: Disallow unused dependencies
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions-rust-lang/setup-rust-toolchain@v1
        with:
          toolchain: nightly
      - name: Run cargo-udeps
        uses: aig787/cargo-udeps-action@v1
        with:
          version: 'latest'
          args: '--all-targets'

Cargo.lock generated

File diff suppressed because it is too large


@@ -1,12 +1,18 @@
[workspace]
resolver = "2"
members = [
"web",
"server",
"notmuch",
"procmail2notmuch",
"shared"
]
default-members = ["server"]
members = ["web", "server", "notmuch", "procmail2notmuch", "shared"]
[workspace.package]
authors = ["Bill Thiede <git@xinu.tv>"]
edition = "2021"
license = "UNLICENSED"
publish = ["xinu"]
version = "0.17.40"
repository = "https://git.z.xinu.tv/wathiede/letterbox"
[profile.dev]
opt-level = 1
[profile.release]
lto = true

Justfile Normal file

@@ -0,0 +1,19 @@
export CARGO_INCREMENTAL := "0"
export RUSTFLAGS := "-D warnings"
default:
@echo "Run: just patch|minor|major"
major: (_release "major")
minor: (_release "minor")
patch: (_release "patch")
sqlx-prepare:
cd server; cargo sqlx prepare && git add .sqlx; git commit -m "cargo sqlx prepare" .sqlx || true
pull:
git pull
_release level: pull sqlx-prepare
cargo-release release -x {{ level }} --workspace --no-confirm --registry=xinu

Makefile Normal file

@@ -0,0 +1,7 @@
.PHONEY: release
release:
(cd server; cargo sqlx prepare && git add .sqlx; git commit -m "cargo sqlx prepare" .sqlx || true)
bash scripts/update-crate-version.sh
git push
all: release

dev.sh

@@ -1,7 +1,7 @@
cd -- "$( dirname -- "${BASH_SOURCE[0]}" )"
tmux new-session -d -s letterbox-dev
tmux rename-window web
tmux send-keys "cd web; trunk serve -w ../shared -w ../notmuch -w ./" C-m
tmux send-keys "cd web; trunk serve -w ../.git -w ../shared -w ../notmuch -w ./" C-m
tmux new-window -n server
tmux send-keys "cd server; cargo watch -x run -w ../shared -w ../notmuch -w ./" C-m
tmux send-keys "cd server; cargo watch -c -w ../.git -w ../shared -w ../notmuch -w ./ -x 'run postgres://newsreader@nixos-07.h.xinu.tv/newsreader ../target/database/newsreader /tmp/letterbox/slurp'" C-m
tmux attach -d -t letterbox-dev


@@ -1,17 +1,24 @@
[package]
name = "notmuch"
version = "0.1.0"
edition = "2021"
name = "letterbox-notmuch"
exclude = ["/testdata"]
description = "Wrapper for calling notmuch cli"
authors.workspace = true
edition.workspace = true
license.workspace = true
publish.workspace = true
repository.workspace = true
version.workspace = true
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
log = "0.4.14"
log = "0.4.27"
mailparse = "0.16.1"
serde = { version = "1.0", features = ["derive"] }
serde_json = { version = "1.0", features = ["unbounded_depth"] }
thiserror = "1.0.30"
thiserror = "2.0.12"
tracing = "0.1.41"
[dev-dependencies]
itertools = "0.10.1"
itertools = "0.14.0"
pretty_assertions = "1"
rayon = "1.5"
rayon = "1.10"


@@ -207,14 +207,15 @@
//! ```
use std::{
collections::HashMap,
ffi::OsStr,
io::{self, BufRead, BufReader, Lines},
io::{self},
path::{Path, PathBuf},
process::{Child, ChildStdout, Command, Stdio},
process::Command,
};
use log::info;
use serde::{Deserialize, Serialize};
use tracing::{error, info, instrument, warn};
/// # Number of seconds since the Epoch
pub type UnixTime = isize;
@@ -269,6 +270,12 @@ pub struct Headers {
#[serde(skip_serializing_if = "Option::is_none")]
pub bcc: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
#[serde(alias = "Delivered-To")]
pub delivered_to: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
#[serde(alias = "X-Original-To")]
pub x_original_to: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub reply_to: Option<String>,
pub date: String,
}
@@ -458,13 +465,17 @@ pub enum NotmuchError {
StringUtf8Error(#[from] std::string::FromUtf8Error),
#[error("failed to parse str as int")]
ParseIntError(#[from] std::num::ParseIntError),
#[error("failed to parse mail: {0}")]
MailParseError(#[from] mailparse::MailParseError),
}
#[derive(Default)]
#[derive(Clone, Default)]
pub struct Notmuch {
config_path: Option<PathBuf>,
}
// TODO: rewrite to use tokio::process::Command and make everything async to see if that helps with
// concurrency being more parallel.
impl Notmuch {
pub fn with_config<P: AsRef<Path>>(config_path: P) -> Notmuch {
Notmuch {
@@ -472,6 +483,7 @@ impl Notmuch {
}
}
#[instrument(skip_all)]
pub fn new(&self) -> Result<Vec<u8>, NotmuchError> {
self.run_notmuch(["new"])
}
@@ -480,38 +492,88 @@ impl Notmuch {
self.run_notmuch(std::iter::empty::<&str>())
}
#[instrument(skip_all, fields(query=query))]
pub fn tags_for_query(&self, query: &str) -> Result<Vec<String>, NotmuchError> {
let res = self.run_notmuch(["search", "--format=json", "--output=tags", query])?;
Ok(serde_json::from_slice(&res)?)
}
pub fn tags(&self) -> Result<Vec<String>, NotmuchError> {
self.tags_for_query("*")
}
pub fn tag_add(&self, tag: &str, search_term: &str) -> Result<(), NotmuchError> {
self.tags_add(tag, &[search_term])
}
#[instrument(skip_all, fields(tag=tag,search_term=?search_term))]
pub fn tags_add(&self, tag: &str, search_term: &[&str]) -> Result<(), NotmuchError> {
let tag = format!("+{tag}");
let mut args = vec!["tag", &tag];
args.extend(search_term);
self.run_notmuch(&args)?;
Ok(())
}
pub fn tag_remove(&self, tag: &str, search_term: &str) -> Result<(), NotmuchError> {
self.tags_remove(tag, &[search_term])
}
#[instrument(skip_all, fields(tag=tag,search_term=?search_term))]
pub fn tags_remove(&self, tag: &str, search_term: &[&str]) -> Result<(), NotmuchError> {
let tag = format!("-{tag}");
let mut args = vec!["tag", &tag];
args.extend(search_term);
self.run_notmuch(&args)?;
Ok(())
}
#[instrument(skip_all, fields(query=query,offset=offset,limit=limit))]
pub fn search(
&self,
query: &str,
offset: usize,
limit: usize,
) -> Result<SearchSummary, NotmuchError> {
let res = self.run_notmuch([
"search",
"--format=json",
&format!("--offset={offset}"),
&format!("--limit={limit}"),
query,
])?;
Ok(serde_json::from_slice(&res)?)
let query = if query.is_empty() { "*" } else { query };
let res = self
.run_notmuch([
"search",
"--format=json",
&format!("--offset={offset}"),
&format!("--limit={limit}"),
query,
])
.inspect_err(|err| error!("failed to notmuch search for query '{query}': {err}"))?;
Ok(serde_json::from_slice(&res).unwrap_or_else(|err| {
error!("failed to decode search result for query '{query}': {err}");
SearchSummary(Vec::new())
}))
}
#[instrument(skip_all, fields(query=query))]
pub fn count(&self, query: &str) -> Result<usize, NotmuchError> {
// NOTE: --output=threads is technically more correct, but really slow
// TODO: find a fast thread count path
// let res = self.run_notmuch(["count", "--output=threads", query])?;
let res = self.run_notmuch(["count", query])?;
// Strip '\n' from res.
let s = std::str::from_utf8(&res[..res.len() - 1])?;
Ok(s.parse()?)
let s = std::str::from_utf8(&res)?.trim();
Ok(s.parse()
.inspect_err(|err| error!("failed to parse count for query '{query}': {err}"))
.unwrap_or(0))
}
#[instrument(skip_all, fields(query=query))]
pub fn show(&self, query: &str) -> Result<ThreadSet, NotmuchError> {
let slice = self.run_notmuch([
"show",
"--include-html=true",
"--entire-thread=true",
"--entire-thread=false",
"--format=json",
query,
])?;
// Notmuch returns JSON with invalid unicode. So we lossy convert it to a string here an
// Notmuch returns JSON with invalid unicode. So we lossy convert it to a string here and
// use that for parsing in rust.
let s = String::from_utf8_lossy(&slice);
let mut deserializer = serde_json::Deserializer::from_str(&s);
@@ -521,6 +583,7 @@ impl Notmuch {
Ok(val)
}
#[instrument(skip_all, fields(query=query,part=part))]
pub fn show_part(&self, query: &str, part: usize) -> Result<Part, NotmuchError> {
let slice = self.run_notmuch([
"show",
@@ -530,7 +593,7 @@ impl Notmuch {
&format!("--part={}", part),
query,
])?;
// Notmuch returns JSON with invalid unicode. So we lossy convert it to a string here an
// Notmuch returns JSON with invalid unicode. So we lossy convert it to a string here and
// use that for parsing in rust.
let s = String::from_utf8_lossy(&slice);
let mut deserializer = serde_json::Deserializer::from_str(&s);
@@ -540,21 +603,107 @@ impl Notmuch {
Ok(val)
}
#[instrument(skip_all, fields(id=id))]
pub fn show_original(&self, id: &MessageId) -> Result<Vec<u8>, NotmuchError> {
self.show_original_part(id, 0)
}
#[instrument(skip_all, fields(id=id,part=part))]
pub fn show_original_part(&self, id: &MessageId, part: usize) -> Result<Vec<u8>, NotmuchError> {
let id = if id.starts_with("id:") {
id
} else {
&format!("id:{id}")
};
let res = self.run_notmuch(["show", "--part", &part.to_string(), id])?;
Ok(res)
}
pub fn message_ids(&self, query: &str) -> Result<Lines<BufReader<ChildStdout>>, NotmuchError> {
let mut child = self.run_notmuch_pipe(["search", "--output=messages", query])?;
Ok(BufReader::new(child.stdout.take().unwrap()).lines())
#[instrument(skip_all, fields(query=query))]
pub fn message_ids(&self, query: &str) -> Result<Vec<String>, NotmuchError> {
let res = self.run_notmuch(["search", "--output=messages", "--format=json", query])?;
Ok(serde_json::from_slice(&res)?)
}
// TODO(wathiede): implement tags() based on "notmuch search --output=tags '*'"
#[instrument(skip_all, fields(query=query))]
pub fn files(&self, query: &str) -> Result<Vec<String>, NotmuchError> {
let res = self.run_notmuch(["search", "--output=files", "--format=json", query])?;
Ok(serde_json::from_slice(&res)?)
}
#[instrument(skip_all)]
pub fn unread_recipients(&self) -> Result<HashMap<String, usize>, NotmuchError> {
let slice = self.run_notmuch([
"show",
"--include-html=false",
"--entire-thread=false",
"--body=false",
"--format=json",
// Arbitrary limit to prevent too much work
"--limit=1000",
"is:unread",
])?;
// Notmuch returns JSON with invalid unicode. So we lossy convert it to a string here and
// use that for parsing in rust.
let s = String::from_utf8_lossy(&slice);
let mut deserializer = serde_json::Deserializer::from_str(&s);
deserializer.disable_recursion_limit();
let ts: ThreadSet = serde::de::Deserialize::deserialize(&mut deserializer)?;
deserializer.end()?;
let mut r = HashMap::new();
fn collect_from_thread_node(
r: &mut HashMap<String, usize>,
tn: &ThreadNode,
) -> Result<(), NotmuchError> {
let Some(msg) = &tn.0 else {
return Ok(());
};
let mut addrs = vec![];
let hdr = &msg.headers.to;
if let Some(to) = hdr {
addrs.push(to);
} else {
let hdr = &msg.headers.x_original_to;
if let Some(to) = hdr {
addrs.push(to);
} else {
let hdr = &msg.headers.delivered_to;
if let Some(to) = hdr {
addrs.push(to);
};
};
};
let hdr = &msg.headers.cc;
if let Some(cc) = hdr {
addrs.push(cc);
};
for recipient in addrs {
mailparse::addrparse(&recipient)?
.into_inner()
.iter()
.for_each(|a| {
let mailparse::MailAddr::Single(si) = a else {
return;
};
let addr = &si.addr;
if addr == "couchmoney@gmail.com" || addr.ends_with("@xinu.tv") {
*r.entry(addr.to_lowercase()).or_default() += 1;
}
});
}
Ok(())
}
for t in ts.0 {
for tn in t.0 {
collect_from_thread_node(&mut r, &tn)?;
for sub_tn in tn.1 {
collect_from_thread_node(&mut r, &sub_tn)?;
}
}
}
Ok(r)
}
fn run_notmuch<I, S>(&self, args: I) -> Result<Vec<u8>, NotmuchError>
where
@@ -568,22 +717,14 @@ impl Notmuch {
cmd.args(args);
info!("{:?}", &cmd);
let out = cmd.output()?;
Ok(out.stdout)
}
fn run_notmuch_pipe<I, S>(&self, args: I) -> Result<Child, NotmuchError>
where
I: IntoIterator<Item = S>,
S: AsRef<OsStr>,
{
let mut cmd = Command::new("notmuch");
if let Some(config_path) = &self.config_path {
cmd.arg("--config").arg(config_path);
if !out.stderr.is_empty() {
warn!(
"{:?}: STDERR:\n{}",
&cmd,
String::from_utf8_lossy(&out.stderr)
);
}
cmd.args(args);
info!("{:?}", &cmd);
let child = cmd.stdout(Stdio::piped()).spawn()?;
Ok(child)
Ok(out.stdout)
}
}


@@ -1,11 +1,10 @@
use std::{
error::Error,
io::{stdout, Write},
time::{Duration, Instant},
time::Instant,
};
use itertools::Itertools;
use notmuch::{Notmuch, NotmuchError, SearchSummary, ThreadSet};
use letterbox_notmuch::Notmuch;
use rayon::iter::{ParallelBridge, ParallelIterator};
#[test]
@@ -23,11 +22,11 @@ fn parse_one() -> Result<(), Box<dyn Error>> {
let total = nm.count("*")? as f32;
let start = Instant::now();
nm.message_ids("*")?
.iter()
.enumerate()
.par_bridge()
.for_each(|(i, msg)| {
let msg = msg.expect("failed to unwrap msg");
let ts = nm
let _ts = nm
.show(&msg)
.expect(&format!("failed to show msg: {}", msg));
//println!("{:?}", ts);
@@ -77,11 +76,9 @@ fn parse_bulk() -> Result<(), Box<dyn Error>> {
.into_iter()
.enumerate()
//.par_bridge()
.for_each(|(i, chunk)| {
let msgs: Result<Vec<_>, _> = chunk.collect();
let msgs = msgs.expect("failed to unwrap msg");
.for_each(|(i, msgs)| {
let query = msgs.join(" OR ");
let ts = nm
let _ts = nm
.show(&query)
.expect(&format!("failed to show msgs: {}", query));
//println!("{:?}", ts);


@@ -1,9 +1,20 @@
[package]
name = "procmail2notmuch"
version = "0.1.0"
edition = "2021"
name = "letterbox-procmail2notmuch"
description = "Tool for generating notmuch rules from procmail"
authors.workspace = true
edition.workspace = true
license.workspace = true
publish.workspace = true
repository.workspace = true
version.workspace = true
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
anyhow = "1.0.69"
anyhow = "1.0.98"
clap = { version = "4.5.37", features = ["derive", "env"] }
letterbox-notmuch = { version = "0.17.9", registry = "xinu" }
letterbox-shared = { version = "0.17.9", registry = "xinu" }
serde = { version = "1.0.219", features = ["derive"] }
sqlx = { version = "0.8.5", features = ["postgres", "runtime-tokio"] }
tokio = { version = "1.44.2", features = ["rt", "macros", "rt-multi-thread"] }


@@ -1,210 +1,36 @@
use std::{convert::Infallible, io::Write, str::FromStr};
use std::{collections::HashMap, io::Write};
#[derive(Debug, Default)]
enum MatchType {
From,
Sender,
To,
Cc,
Subject,
List,
DeliveredTo,
XForwardedTo,
ReplyTo,
XOriginalTo,
XSpam,
Body,
#[default]
Unknown,
}
#[derive(Debug, Default)]
struct Match {
match_type: MatchType,
needle: String,
use clap::{Parser, Subcommand};
use letterbox_shared::{cleanup_match, Match, MatchType, Rule};
use sqlx::{types::Json, PgPool};
#[derive(Debug, Subcommand)]
enum Mode {
Debug,
Notmuchrc,
LoadSql {
#[arg(short, long)]
dsn: String,
},
}
#[derive(Debug, Default)]
struct Rule {
matches: Vec<Match>,
tags: Vec<String>,
/// Simple program to greet a person
#[derive(Parser, Debug)]
#[command(version, about, long_about = None)]
struct Args {
#[arg(short, long, default_value = "/home/wathiede/dotfiles/procmailrc")]
input: String,
#[command(subcommand)]
mode: Mode,
}
fn unescape(s: &str) -> String {
s.replace('\\', "")
}
fn cleanup_match(prefix: &str, s: &str) -> String {
unescape(&s[prefix.len()..]).replace(".*", "")
}
mod matches {
pub const TO: &'static str = "TO";
pub const CC: &'static str = "Cc";
pub const TOCC: &'static str = "(TO|Cc)";
pub const FROM: &'static str = "From";
pub const SENDER: &'static str = "Sender";
pub const SUBJECT: &'static str = "Subject";
pub const DELIVERED_TO: &'static str = "Delivered-To";
pub const X_FORWARDED_TO: &'static str = "X-Forwarded-To";
pub const REPLY_TO: &'static str = "Reply-To";
pub const X_ORIGINAL_TO: &'static str = "X-Original-To";
pub const LIST_ID: &'static str = "List-ID";
pub const X_SPAM: &'static str = "X-Spam";
pub const X_SPAM_FLAG: &'static str = "X-Spam-Flag";
}
impl FromStr for Match {
type Err = Infallible;
fn from_str(s: &str) -> Result<Self, Self::Err> {
// Examples:
// "* 1^0 ^TOsonyrewards.com@xinu.tv"
// "* ^TOsonyrewards.com@xinu.tv"
let mut it = s.split_whitespace().skip(1);
let mut needle = it.next().unwrap();
if needle == "1^0" {
needle = it.next().unwrap();
}
let mut needle = vec![needle];
needle.extend(it);
let needle = needle.join(" ");
let first = needle.chars().nth(0).unwrap_or(' ');
use matches::*;
if first == '^' {
let needle = &needle[1..];
if needle.starts_with(TO) {
return Ok(Match {
match_type: MatchType::To,
needle: cleanup_match(TO, needle),
});
} else if needle.starts_with(FROM) {
return Ok(Match {
match_type: MatchType::From,
needle: cleanup_match(FROM, needle),
});
} else if needle.starts_with(CC) {
return Ok(Match {
match_type: MatchType::Cc,
needle: cleanup_match(CC, needle),
});
} else if needle.starts_with(TOCC) {
return Ok(Match {
match_type: MatchType::To,
needle: cleanup_match(TOCC, needle),
});
} else if needle.starts_with(SENDER) {
return Ok(Match {
match_type: MatchType::Sender,
needle: cleanup_match(SENDER, needle),
});
} else if needle.starts_with(SUBJECT) {
return Ok(Match {
match_type: MatchType::Subject,
needle: cleanup_match(SUBJECT, needle),
});
} else if needle.starts_with(X_ORIGINAL_TO) {
return Ok(Match {
match_type: MatchType::XOriginalTo,
needle: cleanup_match(X_ORIGINAL_TO, needle),
});
} else if needle.starts_with(LIST_ID) {
return Ok(Match {
match_type: MatchType::List,
needle: cleanup_match(LIST_ID, needle),
});
} else if needle.starts_with(REPLY_TO) {
return Ok(Match {
match_type: MatchType::ReplyTo,
needle: cleanup_match(REPLY_TO, needle),
});
} else if needle.starts_with(X_SPAM_FLAG) {
return Ok(Match {
match_type: MatchType::XSpam,
needle: '*'.to_string(),
});
} else if needle.starts_with(X_SPAM) {
return Ok(Match {
match_type: MatchType::XSpam,
needle: '*'.to_string(),
});
} else if needle.starts_with(DELIVERED_TO) {
return Ok(Match {
match_type: MatchType::DeliveredTo,
needle: cleanup_match(DELIVERED_TO, needle),
});
} else if needle.starts_with(X_FORWARDED_TO) {
return Ok(Match {
match_type: MatchType::XForwardedTo,
needle: cleanup_match(X_FORWARDED_TO, needle),
});
} else {
unreachable!("needle: '{needle}'")
}
} else {
return Ok(Match {
match_type: MatchType::Body,
needle: cleanup_match("", &needle),
});
}
}
}
fn notmuch_from_rules<W: Write>(mut w: W, rules: &[Rule]) -> anyhow::Result<()> {
// TODO(wathiede): if reindexing this many tags is too slow, see if combining rules per tag is
// faster.
let mut lines = Vec::new();
for r in rules {
for m in &r.matches {
for t in &r.tags {
if let MatchType::Unknown = m.match_type {
eprintln!("rule has unknown match {:?}", r);
continue;
}
let rule = match m.match_type {
MatchType::From => "from:",
// TODO(wathiede): something more specific?
MatchType::Sender => "from:",
MatchType::To => "to:",
MatchType::Cc => "to:",
MatchType::Subject => "subject:",
MatchType::List => "List-ID:",
MatchType::Body => "",
// TODO(wathiede): these will probably require adding fields to notmuch
// index. Handle them later.
MatchType::DeliveredTo
| MatchType::XForwardedTo
| MatchType::ReplyTo
| MatchType::XOriginalTo
| MatchType::XSpam => continue,
MatchType::Unknown => unreachable!(),
};
// Preserve unread status if run with --remove-all
lines.push(format!(
r#"-unprocessed +{} +unread -- is:unread tag:unprocessed {}"{}""#,
t, rule, m.needle
));
lines.push(format!(
// TODO(wathiede): this assumes `notmuch new` is configured to add
// `tag:unprocessed` to all new mail.
r#"-unprocessed +{} -- tag:unprocessed {}"{}""#,
t, rule, m.needle
));
}
}
}
lines.sort();
for l in lines {
writeln!(w, "{l}")?;
}
Ok(())
}
fn main() -> anyhow::Result<()> {
let input = "/home/wathiede/dotfiles/procmailrc";
#[tokio::main]
async fn main() -> anyhow::Result<()> {
let args = Args::parse();
let mut rules = Vec::new();
let mut cur_rule = Rule::default();
for l in std::fs::read_to_string(input)?.lines() {
for l in std::fs::read_to_string(args.input)?.lines() {
let l = if let Some(idx) = l.find('#') {
&l[..idx]
} else {
@@ -222,6 +48,9 @@ fn main() -> anyhow::Result<()> {
match first {
':' => {
// start of rule
// If carbon-copy flag present, don't stop on match
cur_rule.stop_on_match = !l.contains('c');
}
'*' => {
// add to current rule
@@ -230,26 +59,119 @@ fn main() -> anyhow::Result<()> {
}
'.' => {
// delivery to folder
cur_rule.tags.push(cleanup_match(
cur_rule.tag = cleanup_match(
"",
&l.replace('.', "/")
.replace(' ', "")
.trim_matches('/')
.to_string(),
));
);
rules.push(cur_rule);
cur_rule = Rule::default();
}
'/' => cur_rule = Rule::default(), // Ex. /dev/null
'|' => cur_rule = Rule::default(), // external command
'$' => {
// TODO(wathiede): tag messages with no other tag as 'inbox'
cur_rule.tags.push(cleanup_match("", "inbox"));
cur_rule.tag = cleanup_match("", "inbox");
rules.push(cur_rule);
cur_rule = Rule::default();
} // variable, should only be $DEFAULT in my config
_ => panic!("Unhandled first character '{}' {}", first, l),
_ => panic!("Unhandled first character '{}'\nLine: {}", first, l),
}
}
notmuch_from_rules(std::io::stdout(), &rules)?;
match args.mode {
Mode::Debug => print_rules(&rules),
Mode::Notmuchrc => notmuch_from_rules(std::io::stdout(), &rules)?,
Mode::LoadSql { dsn } => load_sql(&dsn, &rules).await?,
}
Ok(())
}
fn print_rules(rules: &[Rule]) {
let mut tally = HashMap::new();
for r in rules {
for m in &r.matches {
*tally.entry(m.match_type).or_insert(0) += 1;
}
}
let mut sorted: Vec<_> = tally.iter().map(|(k, v)| (v, k)).collect();
sorted.sort();
sorted.reverse();
for (v, k) in sorted {
println!("{k:?}: {v}");
}
}
fn notmuch_from_rules<W: Write>(mut w: W, rules: &[Rule]) -> anyhow::Result<()> {
// TODO(wathiede): if reindexing this many tags is too slow, see if combining rules per tag is
// faster.
let mut lines = Vec::new();
for r in rules {
for m in &r.matches {
let t = &r.tag;
if let MatchType::Unknown = m.match_type {
eprintln!("rule has unknown match {:?}", r);
continue;
}
let rule = match m.match_type {
MatchType::From => "from:",
// TODO(wathiede): something more specific?
MatchType::Sender => "from:",
MatchType::To => "to:",
MatchType::Cc => "to:",
MatchType::Subject => "subject:",
MatchType::ListId => "List-ID:",
MatchType::Body => "",
// TODO(wathiede): these will probably require adding fields to notmuch
// index. Handle them later.
MatchType::DeliveredTo
| MatchType::XForwardedTo
| MatchType::ReplyTo
| MatchType::XOriginalTo
| MatchType::XSpam => continue,
MatchType::Unknown => unreachable!(),
};
// Preserve unread status if run with --remove-all
lines.push(format!(
r#"-unprocessed +{} +unread -- is:unread tag:unprocessed {}"{}""#,
t, rule, m.needle
));
lines.push(format!(
// TODO(wathiede): this assumes `notmuch new` is configured to add
// `tag:unprocessed` to all new mail.
r#"-unprocessed +{} -- tag:unprocessed {}"{}""#,
t, rule, m.needle
));
}
}
lines.sort();
for l in lines {
writeln!(w, "{l}")?;
}
Ok(())
}
async fn load_sql(dsn: &str, rules: &[Rule]) -> anyhow::Result<()> {
let pool = PgPool::connect(dsn).await?;
println!("clearing email_rule table");
sqlx::query!("DELETE FROM email_rule")
.execute(&pool)
.await?;
for (order, rule) in rules.iter().enumerate() {
println!("inserting {order}: {rule:?}");
sqlx::query!(
r#"
INSERT INTO email_rule (sort_order, rule)
VALUES ($1, $2)
"#,
order as i32,
Json(rule) as _
)
.execute(&pool)
.await?;
}
Ok(())
}

procmail2notmuch/update.sh Executable file

@@ -0,0 +1,10 @@
set -e
cd ~/dotfiles
git diff
scp nasx:.procmailrc procmailrc
git diff
cd ~/src/xinu.tv/letterbox/procmail2notmuch
cargo run > /tmp/notmuch.tags
mv /tmp/notmuch.tags ~/dotfiles/notmuch.tags
cd ~/dotfiles
git diff

renovate.json Normal file

@@ -0,0 +1,6 @@
{
"$schema": "https://docs.renovatebot.com/renovate-schema.json",
"extends": [
"config:recommended"
]
}


@@ -0,0 +1,5 @@
#!env bash
set -e -x
cargo-set-version set-version --bump patch
VERSION="$(awk -F\" '/^version/ {print $2}' server/Cargo.toml)"
git commit Cargo.lock */Cargo.toml -m "Bumping version to ${VERSION:?}"


@@ -0,0 +1,32 @@
{
"db_name": "PostgreSQL",
"query": "SELECT\n site,\n name,\n count (\n NOT is_read\n OR NULL\n ) unread\nFROM\n post AS p\n JOIN feed AS f ON p.site = f.slug --\n -- TODO: figure this out to make the query faster when only looking for unread\n --WHERE\n -- (\n -- NOT $1\n -- OR NOT is_read\n -- )\nGROUP BY\n 1,\n 2\nORDER BY\n site\n",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "site",
"type_info": "Text"
},
{
"ordinal": 1,
"name": "name",
"type_info": "Text"
},
{
"ordinal": 2,
"name": "unread",
"type_info": "Int8"
}
],
"parameters": {
"Left": []
},
"nullable": [
true,
true,
null
]
},
"hash": "2dcbedef656e1b725c5ba4fb67d31ce7962d8714449b2fb630f49a7ed1acc270"
}


@@ -0,0 +1,70 @@
{
"db_name": "PostgreSQL",
"query": "SELECT\n date,\n is_read,\n link,\n site,\n summary,\n clean_summary,\n title,\n name,\n homepage\nFROM\n post AS p\nINNER JOIN feed AS f ON p.site = f.slug\nWHERE\n uid = $1\n",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "date",
"type_info": "Timestamp"
},
{
"ordinal": 1,
"name": "is_read",
"type_info": "Bool"
},
{
"ordinal": 2,
"name": "link",
"type_info": "Text"
},
{
"ordinal": 3,
"name": "site",
"type_info": "Text"
},
{
"ordinal": 4,
"name": "summary",
"type_info": "Text"
},
{
"ordinal": 5,
"name": "clean_summary",
"type_info": "Text"
},
{
"ordinal": 6,
"name": "title",
"type_info": "Text"
},
{
"ordinal": 7,
"name": "name",
"type_info": "Text"
},
{
"ordinal": 8,
"name": "homepage",
"type_info": "Text"
}
],
"parameters": {
"Left": [
"Text"
]
},
"nullable": [
true,
true,
false,
true,
true,
true,
true,
true,
true
]
},
"hash": "383221a94bc3746322ba78e41cde37994440ee67dc32e88d2394c51211bde6cd"
}


@@ -0,0 +1,32 @@
{
"db_name": "PostgreSQL",
"query": "SELECT\n p.id,\n link,\n clean_summary\nFROM\n post AS p\nINNER JOIN feed AS f ON p.site = f.slug -- necessary to weed out nzb posts\nWHERE\n search_summary IS NULL\n -- TODO remove AND link ~ '^<'\nORDER BY\n ROW_NUMBER() OVER (PARTITION BY site ORDER BY date DESC)\nLIMIT 100;\n",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "id",
"type_info": "Int4"
},
{
"ordinal": 1,
"name": "link",
"type_info": "Text"
},
{
"ordinal": 2,
"name": "clean_summary",
"type_info": "Text"
}
],
"parameters": {
"Left": []
},
"nullable": [
false,
false,
true
]
},
"hash": "3d271b404f06497a5dcde68cf6bf07291d70fa56058ea736ac24e91d33050c04"
}


@@ -0,0 +1,24 @@
{
"db_name": "PostgreSQL",
"query": "SELECT COUNT(*) AS count\nFROM\n post\nWHERE\n (\n $1::text IS NULL\n OR site = $1\n )\n AND (\n NOT $2\n OR NOT is_read\n )\n AND (\n $3::text IS NULL\n OR TO_TSVECTOR('english', search_summary)\n @@ WEBSEARCH_TO_TSQUERY('english', $3)\n )\n",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "count",
"type_info": "Int8"
}
],
"parameters": {
"Left": [
"Text",
"Bool",
"Text"
]
},
"nullable": [
null
]
},
"hash": "8c1b3c78649135e98b89092237750088433f7ff1b7c2ddeedec553406ea9f203"
}


@@ -0,0 +1,15 @@
{
"db_name": "PostgreSQL",
"query": "UPDATE\n post\nSET\n is_read = $1\nWHERE\n uid = $2\n",
"describe": {
"columns": [],
"parameters": {
"Left": [
"Bool",
"Text"
]
},
"nullable": []
},
"hash": "b39147b9d06171cb742141eda4675688cb702fb284758b1224ed3aa2d7f3b3d9"
}


@@ -0,0 +1,15 @@
{
"db_name": "PostgreSQL",
"query": "UPDATE post SET search_summary = $1 WHERE id = $2",
"describe": {
"columns": [],
"parameters": {
"Left": [
"Text",
"Int4"
]
},
"nullable": []
},
"hash": "ef8327f039dbfa8f4e59b7a77a6411252a346bf51cf940024a17d9fbb2df173c"
}


@@ -0,0 +1,56 @@
{
"db_name": "PostgreSQL",
"query": "SELECT\n site,\n date,\n is_read,\n title,\n uid,\n name\nFROM\n post p\n JOIN feed f ON p.site = f.slug\nWHERE\n ($1::text IS NULL OR site = $1)\n AND (\n NOT $2\n OR NOT is_read\n )\n AND (\n $5 :: text IS NULL\n OR to_tsvector('english', search_summary) @@ websearch_to_tsquery('english', $5)\n )\nORDER BY\n date DESC,\n title OFFSET $3\nLIMIT\n $4\n",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "site",
"type_info": "Text"
},
{
"ordinal": 1,
"name": "date",
"type_info": "Timestamp"
},
{
"ordinal": 2,
"name": "is_read",
"type_info": "Bool"
},
{
"ordinal": 3,
"name": "title",
"type_info": "Text"
},
{
"ordinal": 4,
"name": "uid",
"type_info": "Text"
},
{
"ordinal": 5,
"name": "name",
"type_info": "Text"
}
],
"parameters": {
"Left": [
"Text",
"Bool",
"Int8",
"Int8",
"Text"
]
},
"nullable": [
true,
true,
true,
true,
false,
true
]
},
"hash": "fc4607f02cc76a5f3a6629cce4507c74f52ae44820897b47365da3f339d1da06"
}


@@ -1,25 +1,67 @@
[package]
name = "server"
version = "0.1.0"
edition = "2021"
default-bin = "server"
name = "letterbox-server"
default-run = "letterbox-server"
description = "Backend for letterbox"
authors.workspace = true
edition.workspace = true
license.workspace = true
publish.workspace = true
repository.workspace = true
version.workspace = true
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
rocket = { version = "0.5.0-rc.2", features = [ "json" ] }
rocket_cors = { git = "https://github.com/lawliet89/rocket_cors", branch = "master" }
notmuch = { path = "../notmuch" }
shared = { path = "../shared" }
serde_json = "1.0.87"
thiserror = "1.0.37"
serde = { version = "1.0.147", features = ["derive"] }
log = "0.4.17"
tokio = "1.26.0"
glog = "0.1.0"
chrono-tz = "0.10"
html2text = "0.15"
ammonia = "4.1.0"
anyhow = "1.0.98"
askama = { version = "0.14.0", features = ["derive"] }
async-graphql = { version = "7", features = ["log"] }
async-graphql-axum = "7.0.16"
async-trait = "0.1.88"
axum = { version = "0.8.3", features = ["ws"] }
axum-macros = "0.5.0"
build-info = "0.0.41"
cacher = { version = "0.2.0", registry = "xinu" }
chrono = "0.4.40"
clap = { version = "4.5.37", features = ["derive"] }
css-inline = "0.17.0"
flate2 = "1.1.2"
futures = "0.3.31"
headers = "0.4.0"
html-escape = "0.2.13"
ical = "0.11"
letterbox-notmuch = { path = "../notmuch", version = "0.17.40", registry = "xinu" }
letterbox-shared = { path = "../shared", version = "0.17.40", registry = "xinu" }
linkify = "0.10.0"
lol_html = "2.3.0"
mailparse = "0.16.1"
maplit = "1.0.2"
memmap = "0.7.0"
quick-xml = { version = "0.38.1", features = ["serialize"] }
regex = "1.11.1"
reqwest = { version = "0.12.15", features = ["blocking"] }
scraper = "0.24.0"
serde = { version = "1.0.219", features = ["derive"] }
serde_json = "1.0.140"
sqlx = { version = "0.8.5", features = ["postgres", "runtime-tokio", "time"] }
tantivy = { version = "0.25.0", optional = true }
thiserror = "2.0.12"
tokio = "1.44.2"
tower-http = { version = "0.6.2", features = ["trace"] }
tracing = "0.1.41"
url = "2.5.4"
urlencoding = "2.1.3"
#xtracing = { git = "http://git-private.h.xinu.tv/wathiede/xtracing.git" }
#xtracing = { path = "../../xtracing" }
xtracing = { version = "0.3.2", registry = "xinu" }
zip = "4.3.0"
[dependencies.rocket_contrib]
version = "0.4.11"
default-features = false
features = ["json"]
[build-dependencies]
build-info-build = "0.0.41"
[features]
#default = [ "tantivy" ]
tantivy = ["dep:tantivy"]


@@ -1,9 +1,13 @@
[release]
address = "0.0.0.0"
port = 9345
newsreader_database_url = "postgres://newsreader@nixos-07.h.xinu.tv/newsreader"
newsreader_tantivy_db_path = "../target/database/newsreader"
[debug]
address = "0.0.0.0"
port = 9345
# Uncomment to make it production like.
#log_level = "critical"
newsreader_database_url = "postgres://newsreader@nixos-07.h.xinu.tv/newsreader"
newsreader_tantivy_db_path = "../target/database/newsreader"
slurp_cache_path = "/tmp/letterbox/slurp"

server/build.rs Normal file

@@ -0,0 +1,6 @@
fn main() {
// Calling `build_info_build::build_script` collects all data and makes it available to `build_info::build_info!`
// and `build_info::format!` in the main program.
build_info_build::build_script();
println!("cargo:rerun-if-changed=templates");
}


@@ -0,0 +1,3 @@
DROP INDEX IF EXISTS post_summary_idx;
DROP INDEX IF EXISTS post_site_idx;
DROP INDEX IF EXISTS post_title_idx;


@@ -0,0 +1,3 @@
CREATE INDEX post_summary_idx ON post USING GIN (to_tsvector('english', summary));
CREATE INDEX post_site_idx ON post USING GIN (to_tsvector('english', site));
CREATE INDEX post_title_idx ON post USING GIN (to_tsvector('english', title));


@@ -0,0 +1,24 @@
BEGIN;
ALTER TABLE IF EXISTS public."Email" DROP CONSTRAINT IF EXISTS email_avatar_fkey;
ALTER TABLE IF EXISTS public."EmailDisplayName" DROP CONSTRAINT IF EXISTS email_id_fk;
ALTER TABLE IF EXISTS public."Message" DROP CONSTRAINT IF EXISTS message_to_fkey;
ALTER TABLE IF EXISTS public."Message" DROP CONSTRAINT IF EXISTS message_cc_fkey;
ALTER TABLE IF EXISTS public."Message" DROP CONSTRAINT IF EXISTS message_from_fkey;
ALTER TABLE IF EXISTS public."Message" DROP CONSTRAINT IF EXISTS message_header_fkey;
ALTER TABLE IF EXISTS public."Message" DROP CONSTRAINT IF EXISTS message_file_fkey;
ALTER TABLE IF EXISTS public."Message" DROP CONSTRAINT IF EXISTS message_body_id_fkey;
ALTER TABLE IF EXISTS public."Message" DROP CONSTRAINT IF EXISTS message_thread_fkey;
ALTER TABLE IF EXISTS public."Message" DROP CONSTRAINT IF EXISTS message_tag_fkey;
DROP TABLE IF EXISTS public."Email";
DROP TABLE IF EXISTS public."EmailDisplayName";
DROP TABLE IF EXISTS public."Message";
DROP TABLE IF EXISTS public."Header";
DROP TABLE IF EXISTS public."File";
DROP TABLE IF EXISTS public."Avatar";
DROP TABLE IF EXISTS public."Body";
DROP TABLE IF EXISTS public."Thread";
DROP TABLE IF EXISTS public."Tag";
END;


@@ -0,0 +1,174 @@
-- This script was generated by the ERD tool in pgAdmin 4.
-- Please log an issue at https://github.com/pgadmin-org/pgadmin4/issues/new/choose if you find any bugs, including reproduction steps.
BEGIN;
ALTER TABLE IF EXISTS public."Email" DROP CONSTRAINT IF EXISTS email_avatar_fkey;
ALTER TABLE IF EXISTS public."EmailDisplayName" DROP CONSTRAINT IF EXISTS email_id_fk;
ALTER TABLE IF EXISTS public."Message" DROP CONSTRAINT IF EXISTS message_to_fkey;
ALTER TABLE IF EXISTS public."Message" DROP CONSTRAINT IF EXISTS message_cc_fkey;
ALTER TABLE IF EXISTS public."Message" DROP CONSTRAINT IF EXISTS message_from_fkey;
ALTER TABLE IF EXISTS public."Message" DROP CONSTRAINT IF EXISTS message_header_fkey;
ALTER TABLE IF EXISTS public."Message" DROP CONSTRAINT IF EXISTS message_file_fkey;
ALTER TABLE IF EXISTS public."Message" DROP CONSTRAINT IF EXISTS message_body_id_fkey;
ALTER TABLE IF EXISTS public."Message" DROP CONSTRAINT IF EXISTS message_thread_fkey;
ALTER TABLE IF EXISTS public."Message" DROP CONSTRAINT IF EXISTS message_tag_fkey;
CREATE TABLE IF NOT EXISTS public."Email"
(
id integer NOT NULL GENERATED ALWAYS AS IDENTITY,
address text NOT NULL,
avatar_id integer,
PRIMARY KEY (id),
CONSTRAINT avatar_id UNIQUE (avatar_id)
);
CREATE TABLE IF NOT EXISTS public."EmailDisplayName"
(
id integer NOT NULL GENERATED ALWAYS AS IDENTITY,
email_id integer NOT NULL,
PRIMARY KEY (id)
);
CREATE TABLE IF NOT EXISTS public."Message"
(
id integer NOT NULL GENERATED ALWAYS AS IDENTITY,
subject text,
"from" integer,
"to" integer,
cc integer,
header_id integer,
hash text NOT NULL,
file_id integer NOT NULL,
date timestamp with time zone NOT NULL,
unread boolean NOT NULL,
body_id integer NOT NULL,
thread_id integer NOT NULL,
tag_id integer,
CONSTRAINT body_id UNIQUE (body_id)
);
CREATE TABLE IF NOT EXISTS public."Header"
(
id integer NOT NULL GENERATED ALWAYS AS IDENTITY,
key text NOT NULL,
value text NOT NULL,
PRIMARY KEY (id)
);
CREATE TABLE IF NOT EXISTS public."File"
(
id integer NOT NULL GENERATED ALWAYS AS IDENTITY,
path text NOT NULL,
PRIMARY KEY (id)
);
CREATE TABLE IF NOT EXISTS public."Avatar"
(
id integer NOT NULL GENERATED ALWAYS AS IDENTITY,
url text NOT NULL,
PRIMARY KEY (id)
);
CREATE TABLE IF NOT EXISTS public."Body"
(
id integer NOT NULL GENERATED ALWAYS AS IDENTITY,
text text NOT NULL,
PRIMARY KEY (id)
);
CREATE TABLE IF NOT EXISTS public."Thread"
(
id integer NOT NULL GENERATED ALWAYS AS IDENTITY,
PRIMARY KEY (id)
);
CREATE TABLE IF NOT EXISTS public."Tag"
(
id integer NOT NULL GENERATED ALWAYS AS IDENTITY,
name text NOT NULL,
display text,
fg_color integer,
bg_color integer,
PRIMARY KEY (id)
);
ALTER TABLE IF EXISTS public."Email"
ADD CONSTRAINT email_avatar_fkey FOREIGN KEY (avatar_id)
REFERENCES public."Avatar" (id) MATCH SIMPLE
ON UPDATE NO ACTION
ON DELETE NO ACTION
NOT VALID;
ALTER TABLE IF EXISTS public."EmailDisplayName"
ADD CONSTRAINT email_id_fk FOREIGN KEY (email_id)
REFERENCES public."Email" (id) MATCH SIMPLE
ON UPDATE NO ACTION
ON DELETE NO ACTION
NOT VALID;
ALTER TABLE IF EXISTS public."Message"
ADD CONSTRAINT message_to_fkey FOREIGN KEY ("to")
REFERENCES public."Email" (id) MATCH SIMPLE
ON UPDATE NO ACTION
ON DELETE NO ACTION
NOT VALID;
ALTER TABLE IF EXISTS public."Message"
ADD CONSTRAINT message_cc_fkey FOREIGN KEY (cc)
REFERENCES public."Email" (id) MATCH SIMPLE
ON UPDATE NO ACTION
ON DELETE NO ACTION
NOT VALID;
ALTER TABLE IF EXISTS public."Message"
ADD CONSTRAINT message_from_fkey FOREIGN KEY ("from")
REFERENCES public."Email" (id) MATCH SIMPLE
ON UPDATE NO ACTION
ON DELETE NO ACTION
NOT VALID;
ALTER TABLE IF EXISTS public."Message"
ADD CONSTRAINT message_header_fkey FOREIGN KEY (header_id)
REFERENCES public."Header" (id) MATCH SIMPLE
ON UPDATE NO ACTION
ON DELETE NO ACTION
NOT VALID;
ALTER TABLE IF EXISTS public."Message"
ADD CONSTRAINT message_file_fkey FOREIGN KEY (file_id)
REFERENCES public."File" (id) MATCH SIMPLE
ON UPDATE NO ACTION
ON DELETE NO ACTION
NOT VALID;
ALTER TABLE IF EXISTS public."Message"
ADD CONSTRAINT message_body_id_fkey FOREIGN KEY (body_id)
REFERENCES public."Body" (id) MATCH SIMPLE
ON UPDATE NO ACTION
ON DELETE NO ACTION
NOT VALID;
ALTER TABLE IF EXISTS public."Message"
ADD CONSTRAINT message_thread_fkey FOREIGN KEY (thread_id)
REFERENCES public."Thread" (id) MATCH SIMPLE
ON UPDATE NO ACTION
ON DELETE NO ACTION
NOT VALID;
ALTER TABLE IF EXISTS public."Message"
ADD CONSTRAINT message_tag_fkey FOREIGN KEY (tag_id)
REFERENCES public."Tag" (id) MATCH SIMPLE
ON UPDATE NO ACTION
ON DELETE NO ACTION
NOT VALID;
END;


@@ -0,0 +1,3 @@
-- Add down migration script here
ALTER TABLE
post DROP CONSTRAINT post_link_key;


@@ -0,0 +1,28 @@
WITH dupes AS (
SELECT
uid,
link,
Row_number() over(
PARTITION by link
ORDER BY
link
) AS RowNumber
FROM
post
)
DELETE FROM
post
WHERE
uid IN (
SELECT
uid
FROM
dupes
WHERE
RowNumber > 1
);
ALTER TABLE
post
ADD
UNIQUE (link);
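The CTE above numbers rows within each `link` partition and deletes everything past the first, so one row survives per link before the UNIQUE constraint lands. A hypothetical in-memory equivalent of that keep-one-per-link pass (`dedupe_by_link` is illustrative, not part of the codebase; like the SQL, which duplicate survives is arbitrary — here it is first-seen):

```rust
use std::collections::HashSet;

// Keep the first row seen for each link, drop later duplicates —
// the same effect as the migration's ROW_NUMBER() > 1 delete.
fn dedupe_by_link(posts: Vec<(u32, &str)>) -> Vec<(u32, &str)> {
    let mut seen = HashSet::new();
    posts
        .into_iter()
        .filter(|(_uid, link)| seen.insert(link.to_string()))
        .collect()
}
```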


@@ -0,0 +1,7 @@
ALTER TABLE
post
ALTER COLUMN
link DROP NOT NULL;
ALTER TABLE
post DROP CONSTRAINT link;


@@ -0,0 +1,17 @@
DELETE FROM
post
WHERE
link IS NULL
OR link = '';
ALTER TABLE
post
ALTER COLUMN
link
SET
NOT NULL;
ALTER TABLE
post
ADD
CONSTRAINT link CHECK (link <> '');


@@ -0,0 +1,3 @@
DROP TABLE IF EXISTS email_address;
DROP TABLE IF EXISTS photo;
DROP TABLE IF EXISTS google_person;


@@ -0,0 +1,19 @@
-- Add up migration script here
CREATE TABLE IF NOT EXISTS google_person (
id SERIAL PRIMARY KEY,
resource_name TEXT NOT NULL UNIQUE,
display_name TEXT NOT NULL
);
CREATE TABLE IF NOT EXISTS email_photo (
id SERIAL PRIMARY KEY,
google_person_id INTEGER REFERENCES google_person (id) UNIQUE,
url TEXT NOT NULL
);
CREATE TABLE IF NOT EXISTS email_address (
id SERIAL PRIMARY KEY,
address TEXT NOT NULL UNIQUE,
email_photo_id INTEGER REFERENCES email_photo (id),
google_person_id INTEGER REFERENCES google_person (id)
);


@@ -0,0 +1,5 @@
-- Add down migration script here
DROP INDEX post_summary_idx;
CREATE INDEX post_summary_idx ON post USING gin (
to_tsvector('english', summary)
);


@@ -0,0 +1,11 @@
-- Something like this around summary in the idx w/ tsvector
DROP INDEX post_summary_idx;
CREATE INDEX post_summary_idx ON post USING gin (to_tsvector(
'english',
regexp_replace(
regexp_replace(summary, '<[^>]+>', ' ', 'g'),
'\s+',
' ',
'g'
)
));


@@ -0,0 +1,2 @@
-- Add down migration script here
DROP INDEX nzb_posts_created_at_idx;


@@ -0,0 +1,2 @@
-- Add up migration script here
CREATE INDEX nzb_posts_created_at_idx ON nzb_posts USING btree (created_at);


@@ -0,0 +1,15 @@
-- Add down migration script here
BEGIN;
DROP INDEX IF EXISTS post_search_summary_idx;
ALTER TABLE post DROP search_summary;
-- CREATE INDEX post_summary_idx ON post USING gin (to_tsvector(
-- 'english',
-- regexp_replace(
-- regexp_replace(summary, '<[^>]+>', ' ', 'g'),
-- '\s+',
-- ' ',
-- 'g'
-- )
-- ));
COMMIT;


@@ -0,0 +1,14 @@
-- Add up migration script here
BEGIN;
DROP INDEX IF EXISTS post_summary_idx;
ALTER TABLE post ADD search_summary TEXT;
CREATE INDEX post_search_summary_idx ON post USING gin (
to_tsvector('english', search_summary)
);
UPDATE post SET search_summary = regexp_replace(
regexp_replace(summary, '<[^>]+>', ' ', 'g'),
'\s+',
' ',
'g'
);
COMMIT;
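The UPDATE's two nested `regexp_replace` calls strip HTML tags and collapse whitespace runs before indexing. A rough re-implementation of that cleanup in Rust, to show what ends up in `search_summary` (illustrative only — the real transformation runs in Postgres, and this sketch treats any `<...>` span as a tag rather than using the exact `<[^>]+>` pattern):

```rust
// Strip `<...>` tags (replacing each with a space, as the SQL does),
// then collapse runs of whitespace down to single spaces.
fn to_search_summary(summary: &str) -> String {
    let mut out = String::new();
    let mut in_tag = false;
    for c in summary.chars() {
        match c {
            '<' => in_tag = true,
            '>' if in_tag => {
                in_tag = false;
                out.push(' '); // each tag becomes a space
            }
            c if !in_tag => out.push(c),
            _ => {} // characters inside a tag are dropped
        }
    }
    out.split_whitespace().collect::<Vec<_>>().join(" ")
}
```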


@@ -0,0 +1,20 @@
-- Bad examples:
-- https://nzbfinder.ws/getnzb/d2c3e5a08abadd985dccc6a574122892030b6a9a.nzb&i=95972&r=b55082d289937c050dedc203c9653850
-- https://nzbfinder.ws/getnzb?id=45add174-7da4-4445-bf2b-a67dbbfc07fe.nzb&r=b55082d289937c050dedc203c9653850
-- https://nzbfinder.ws/api/v1/getnzb?id=82486020-c192-4fa0-a7e7-798d7d72e973.nzb&r=b55082d289937c050dedc203c9653850
UPDATE nzb_posts
SET link =
regexp_replace(
regexp_replace(
regexp_replace(
link,
'https://nzbfinder.ws/getnzb/',
'https://nzbfinder.ws/api/v1/getnzb?id='
),
'https://nzbfinder.ws/getnzb',
'https://nzbfinder.ws/api/v1/getnzb'
),
'&r=',
'&apikey='
)
;
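Since the three `regexp_replace` patterns above are all literal strings, plain string replacement reproduces the migration's effect. A sketch of the same rewrite chain (hypothetical helper, named here for illustration), useful for checking the "bad examples" by hand:

```rust
// Mirror the migration's nested regexp_replace calls. Order matters:
// the path-style `/getnzb/` form is rewritten first, then the bare
// `/getnzb` form, then the `&r=` auth parameter becomes `&apikey=`.
fn rewrite_nzb_link(link: &str) -> String {
    link.replace(
        "https://nzbfinder.ws/getnzb/",
        "https://nzbfinder.ws/api/v1/getnzb?id=",
    )
    .replace(
        "https://nzbfinder.ws/getnzb",
        "https://nzbfinder.ws/api/v1/getnzb",
    )
    .replace("&r=", "&apikey=")
}
```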


@@ -0,0 +1,3 @@
DROP TABLE IF EXISTS email_rule;
-- Add down migration script here


@@ -0,0 +1,5 @@
CREATE TABLE IF NOT EXISTS email_rule (
id integer NOT NULL GENERATED ALWAYS AS IDENTITY,
sort_order integer NOT NULL,
rule jsonb NOT NULL
);


@@ -0,0 +1,2 @@
-- Add down migration script here
ALTER TABLE feed DROP COLUMN IF EXISTS disabled;


@@ -0,0 +1,2 @@
-- Add up migration script here
ALTER TABLE feed ADD disabled boolean;

server/sql/all-posts.sql Normal file

@@ -0,0 +1,14 @@
SELECT
site,
title,
summary,
link,
date,
is_read,
uid,
p.id id
FROM
post AS p
JOIN feed AS f ON p.site = f.slug -- necessary to weed out nzb posts
ORDER BY
date DESC;

server/sql/all-uids.sql Normal file

@@ -0,0 +1,6 @@
SELECT
uid
FROM
post AS p
JOIN feed AS f ON p.site = f.slug -- necessary to weed out nzb posts
;

server/sql/count.sql Normal file

@@ -0,0 +1,17 @@
SELECT COUNT(*) AS count
FROM
post
WHERE
(
$1::text IS NULL
OR site = $1
)
AND (
NOT $2
OR NOT is_read
)
AND (
$3::text IS NULL
OR TO_TSVECTOR('english', search_summary)
@@ WEBSEARCH_TO_TSQUERY('english', $3)
)
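Each clause in count.sql follows the same "NULL parameter means no filter" idiom: `$1::text IS NULL OR site = $1` is a no-op when the caller passes NULL. Expressed as an Option-based predicate in Rust (illustrative only; the actual filtering happens in Postgres):

```rust
// `$1::text IS NULL OR site = $1`: None means "match everything",
// Some(f) means "match only rows whose site equals f".
fn site_matches(filter: Option<&str>, site: &str) -> bool {
    filter.map_or(true, |f| f == site)
}
```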


@@ -0,0 +1 @@
SELECT rule as "rule: Json<Rule>" FROM email_rule ORDER BY sort_order


@@ -0,0 +1,13 @@
SELECT
p.id,
link,
clean_summary
FROM
post AS p
INNER JOIN feed AS f ON p.site = f.slug -- necessary to weed out nzb posts
WHERE
search_summary IS NULL
-- TODO remove AND link ~ '^<'
ORDER BY
ROW_NUMBER() OVER (PARTITION BY site ORDER BY date DESC)
LIMIT 100;


@@ -0,0 +1 @@
SELECT url FROM email_photo ep JOIN email_address ea ON ep.id = ea.email_photo_id WHERE address = $1


@@ -0,0 +1,14 @@
SELECT
site AS "site!",
title AS "title!",
summary AS "summary!",
link AS "link!",
date AS "date!",
is_read AS "is_read!",
uid AS "uid!",
p.id id
FROM
post p
JOIN feed f ON p.site = f.slug
WHERE
uid = ANY ($1);


@@ -0,0 +1,6 @@
UPDATE
post
SET
is_read = $1
WHERE
uid = $2

server/sql/tags.sql Normal file

@@ -0,0 +1,21 @@
SELECT
site,
name,
count (
NOT is_read
OR NULL
) unread
FROM
post AS p
JOIN feed AS f ON p.site = f.slug --
-- TODO: figure this out to make the query faster when only looking for unread
--WHERE
-- (
-- NOT $1
-- OR NOT is_read
-- )
GROUP BY
1,
2
ORDER BY
site

server/sql/thread.sql Normal file

@@ -0,0 +1,15 @@
SELECT
date,
is_read,
link,
site,
summary,
clean_summary,
title,
name,
homepage
FROM
post AS p
INNER JOIN feed AS f ON p.site = f.slug
WHERE
uid = $1


@@ -0,0 +1,14 @@
SELECT
site,
date,
is_read,
title,
uid,
name
FROM
post p
JOIN feed f ON p.site = f.slug
WHERE
uid = ANY ($1)
ORDER BY
date DESC;

server/sql/threads.sql Normal file

@@ -0,0 +1,25 @@
SELECT
site,
date,
is_read,
title,
uid,
name
FROM
post p
JOIN feed f ON p.site = f.slug
WHERE
($1::text IS NULL OR site = $1)
AND (
NOT $2
OR NOT is_read
)
AND (
$5::text IS NULL
OR to_tsvector('english', search_summary) @@ websearch_to_tsquery('english', $5)
)
ORDER BY
date DESC,
title OFFSET $3
LIMIT
$4


@@ -0,0 +1,13 @@
select t.id, tt.tokid, tt.alias, length(t.token), t.token from (
select id, (ts_parse('default',
-- regexp_replace(
-- regexp_replace(summary, '<[^>]+>', ' ', 'g'),
-- '\s+',
-- ' ',
-- 'g'
-- )
summary
)).* from post) t
inner join ts_token_type('default') tt
on t.tokid = tt.tokid
where length(token) >= 2*1024;


@@ -0,0 +1,16 @@
use std::fs;
use letterbox_server::sanitize_html;
fn main() -> anyhow::Result<()> {
let mut args = std::env::args().skip(1);
let src = args.next().expect("source not specified");
let dst = args.next().expect("destination not specified");
println!("Sanitizing {src} into {dst}");
let bytes = fs::read(src)?;
let html = String::from_utf8_lossy(&bytes);
let html = sanitize_html(&html, "", &None)?;
fs::write(dst, html)?;
Ok(())
}


@@ -0,0 +1,21 @@
use std::fs;
use url::Url;
fn main() -> anyhow::Result<()> {
println!("PWD: {}", std::env::current_dir()?.display());
let _url = "https://slashdot.org/story/25/01/24/1813201/walgreens-replaced-fridge-doors-with-smart-screens-its-now-a-200-million-fiasco?utm_source=rss1.0mainlinkanon&utm_medium=feed";
let _url = "https://hackaday.com/2025/01/24/hackaday-podcast-episode-305-caustic-clocks-practice-bones-and-brick-layers/";
let _url = "https://theonion.com/monster-devastated-to-see-film-depicting-things-he-told-guillermo-del-toro-in-confidence/";
let _url = "https://trofi.github.io/posts/330-another-nix-language-nondeterminism-example.html";
let _url = "https://blog.cloudflare.com/ddos-threat-report-for-2024-q4/";
let url = "https://trofi.github.io/posts/330-another-nix-language-nondeterminism-example.html";
let body = reqwest::blocking::get(url)?.text()?;
let output = "/tmp/h2md/output.html";
let inliner = css_inline::CSSInliner::options()
.base_url(Url::parse(url).ok())
.build();
let inlined = inliner.inline(&body)?;
fs::write(output, inlined)?;
Ok(())
}


@@ -0,0 +1,344 @@
// Rocket generates a lot of warnings for handlers
// TODO: figure out why
#![allow(unreachable_patterns)]
use std::{error::Error, net::SocketAddr, sync::Arc, time::Duration};
use async_graphql::{extensions, http::GraphiQLSource, Schema};
use async_graphql_axum::{GraphQL, GraphQLSubscription};
// allows extracting the IP of the connecting user
use axum::extract::connect_info::ConnectInfo;
use axum::{
extract::{self, ws::WebSocketUpgrade, Query, State},
http::{header, StatusCode},
response::{self, IntoResponse, Response},
routing::{any, get, post},
Router,
};
use cacher::FilesystemCacher;
use clap::Parser;
use letterbox_notmuch::Notmuch;
#[cfg(feature = "tantivy")]
use letterbox_server::tantivy::TantivyConnection;
use letterbox_server::{
graphql::{compute_catchup_ids, Attachment, MutationRoot, QueryRoot, SubscriptionRoot},
nm::{attachment_bytes, cid_attachment_bytes, label_unprocessed},
ws::ConnectionTracker,
};
use letterbox_shared::WebsocketMessage;
use serde::Deserialize;
use sqlx::postgres::PgPool;
use tokio::{net::TcpListener, sync::Mutex};
use tower_http::trace::{DefaultMakeSpan, TraceLayer};
use tracing::{error, info};
// Make our own error that wraps `ServerError`.
struct AppError(letterbox_server::ServerError);
// Tell axum how to convert `AppError` into a response.
impl IntoResponse for AppError {
fn into_response(self) -> Response {
(
StatusCode::INTERNAL_SERVER_ERROR,
format!("Something went wrong: {}", self.0),
)
.into_response()
}
}
// This enables using `?` on functions that return `Result<_, letterbox_server::ServerError>` to turn them into
// `Result<_, AppError>`. That way you don't need to do that manually.
impl<E> From<E> for AppError
where
E: Into<letterbox_server::ServerError>,
{
fn from(err: E) -> Self {
Self(err.into())
}
}
fn inline_attachment_response(attachment: Attachment) -> impl IntoResponse {
info!("attachment filename {:?}", attachment.filename);
let mut hdr_map = headers::HeaderMap::new();
if let Some(filename) = attachment.filename {
hdr_map.insert(
header::CONTENT_DISPOSITION,
format!(r#"inline; filename="{}""#, filename)
.parse()
.unwrap(),
);
}
if let Some(ct) = attachment.content_type {
hdr_map.insert(header::CONTENT_TYPE, ct.parse().unwrap());
}
info!("hdr_map {hdr_map:?}");
(hdr_map, attachment.bytes).into_response()
}
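Both responders build the `Content-Disposition` value with a `format!` around the filename. The quoted-string assembly looks like this (hypothetical helper for illustration); note that a filename containing a double quote would break the quoting here — RFC 6266 escaping or the `filename*` form would be needed for such names:

```rust
// Assemble a Content-Disposition value, e.g. `inline; filename="cat.jpg"`.
// `kind` is "inline" or "attachment"; the filename is not escaped.
fn content_disposition(kind: &str, filename: &str) -> String {
    format!(r#"{}; filename="{}""#, kind, filename)
}
```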
fn download_attachment_response(attachment: Attachment) -> impl IntoResponse {
info!("attachment filename {:?}", attachment.filename);
let mut hdr_map = headers::HeaderMap::new();
if let Some(filename) = attachment.filename {
hdr_map.insert(
header::CONTENT_DISPOSITION,
format!(r#"attachment; filename="{}""#, filename)
.parse()
.unwrap(),
);
}
if let Some(ct) = attachment.content_type {
hdr_map.insert(header::CONTENT_TYPE, ct.parse().unwrap());
}
info!("hdr_map {hdr_map:?}");
(hdr_map, attachment.bytes).into_response()
}
#[axum_macros::debug_handler]
async fn view_attachment(
State(AppState { nm, .. }): State<AppState>,
extract::Path((id, idx, _)): extract::Path<(String, String, String)>,
) -> Result<impl IntoResponse, AppError> {
let mid = if id.starts_with("id:") {
id.to_string()
} else {
format!("id:{}", id)
};
info!("view attachment {mid} {idx}");
let idx: Vec<_> = idx
.split('.')
.map(|s| s.parse().expect("not a usize"))
.collect();
let attachment = attachment_bytes(&nm, &mid, &idx)?;
Ok(inline_attachment_response(attachment))
}
async fn download_attachment(
State(AppState { nm, .. }): State<AppState>,
extract::Path((id, idx, _)): extract::Path<(String, String, String)>,
) -> Result<impl IntoResponse, AppError> {
let mid = if id.starts_with("id:") {
id.to_string()
} else {
format!("id:{}", id)
};
info!("download attachment message id '{mid}' idx '{idx}'");
let idx: Vec<_> = idx
.split('.')
.filter(|s| !s.is_empty())
.map(|s| s.parse().expect("not a usize"))
.collect();
let attachment = attachment_bytes(&nm, &mid, &idx)?;
Ok(download_attachment_response(attachment))
}
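`view_attachment` and `download_attachment` repeat the same two normalizations: prefixing a bare Message-ID with `id:` and parsing the dotted part index. Sketched as standalone helpers (hypothetical names, not in the codebase; `parse_idx` ignores empty segments as `download_attachment` does):

```rust
// Prefix a bare Message-ID with "id:" as notmuch queries expect.
fn normalize_mid(id: &str) -> String {
    if id.starts_with("id:") {
        id.to_string()
    } else {
        format!("id:{}", id)
    }
}

// Parse a dotted part index like "1.2.3" into a path of part numbers,
// skipping empty segments; panics on non-numeric segments, like the handlers.
fn parse_idx(idx: &str) -> Vec<usize> {
    idx.split('.')
        .filter(|s| !s.is_empty())
        .map(|s| s.parse().expect("not a usize"))
        .collect()
}
```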
async fn view_cid(
State(AppState { nm, .. }): State<AppState>,
extract::Path((id, cid)): extract::Path<(String, String)>,
) -> Result<impl IntoResponse, AppError> {
let mid = if id.starts_with("id:") {
id.to_string()
} else {
format!("id:{}", id)
};
info!("view cid attachment {mid} {cid}");
let attachment = cid_attachment_bytes(&nm, &mid, &cid)?;
Ok(inline_attachment_response(attachment))
}
// TODO make this work with gitea message ids like `wathiede/letterbox/pulls/91@git.z.xinu.tv`
async fn view_original(
State(AppState { nm, .. }): State<AppState>,
extract::Path(id): extract::Path<String>,
) -> Result<impl IntoResponse, AppError> {
info!("view_original {id}");
let bytes = nm.show_original(&id)?;
let s = String::from_utf8_lossy(&bytes).to_string();
Ok(s.into_response())
}
async fn graphiql() -> impl IntoResponse {
response::Html(
GraphiQLSource::build()
.endpoint("/api/graphql/")
.subscription_endpoint("/api/graphql/ws")
.finish(),
)
}
async fn start_ws(
ws: WebSocketUpgrade,
ConnectInfo(addr): ConnectInfo<SocketAddr>,
State(AppState {
connection_tracker, ..
}): State<AppState>,
) -> impl IntoResponse {
    info!("initiating websocket connection for {addr}");
ws.on_upgrade(async move |socket| connection_tracker.lock().await.add_peer(socket, addr).await)
}
#[derive(Debug, Deserialize)]
struct NotificationParams {
delay_ms: Option<u64>,
num_unprocessed: Option<usize>,
}
async fn send_refresh_websocket_handler(
State(AppState {
nm,
pool,
connection_tracker,
..
}): State<AppState>,
params: Query<NotificationParams>,
) -> impl IntoResponse {
info!("send_refresh_websocket_handler params {params:?}");
if let Some(delay_ms) = params.delay_ms {
let delay = Duration::from_millis(delay_ms);
info!("sleeping {delay:?}");
tokio::time::sleep(delay).await;
}
let limit = match params.num_unprocessed {
Some(0) => None,
Some(limit) => Some(limit),
None => Some(10),
};
let mut ids = None;
match label_unprocessed(&nm, &pool, false, limit, "tag:unprocessed").await {
Ok(i) => ids = Some(i),
Err(err) => error!("Failed to label_unprocessed: {err:?}"),
};
connection_tracker
.lock()
.await
.send_message_all(WebsocketMessage::RefreshMessages)
.await;
if let Some(ids) = ids {
format!("{ids:?}")
} else {
"refresh triggered".to_string()
}
}
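The `num_unprocessed` parameter above carries a three-way meaning: absent defaults to 10, 0 means "no limit", and any other value is used as-is. That mapping, pulled out as a pure function (hypothetical name, for illustration):

```rust
// absent => default of 10, Some(0) => unlimited (None), Some(n) => Some(n)
fn unprocessed_limit(param: Option<usize>) -> Option<usize> {
    match param {
        Some(0) => None,
        Some(n) => Some(n),
        None => Some(10),
    }
}
```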
async fn watch_new(
nm: Notmuch,
pool: PgPool,
conn_tracker: Arc<Mutex<ConnectionTracker>>,
poll_time: Duration,
) -> Result<(), async_graphql::Error> {
async fn watch_new_iteration(
nm: &Notmuch,
pool: &PgPool,
conn_tracker: Arc<Mutex<ConnectionTracker>>,
old_ids: &[String],
) -> Result<Vec<String>, async_graphql::Error> {
let ids = compute_catchup_ids(&nm, &pool, "is:unread").await?;
info!("old_ids: {} ids: {}", old_ids.len(), ids.len());
if old_ids != ids {
label_unprocessed(&nm, &pool, false, Some(100), "tag:unprocessed").await?;
conn_tracker
.lock()
.await
.send_message_all(WebsocketMessage::RefreshMessages)
.await
}
Ok(ids)
}
let mut old_ids = Vec::new();
loop {
old_ids = match watch_new_iteration(&nm, &pool, conn_tracker.clone(), &old_ids).await {
Ok(old_ids) => old_ids,
Err(err) => {
error!("watch_new_iteration failed: {err:?}");
continue;
}
};
tokio::time::sleep(poll_time).await;
}
}
#[derive(Clone)]
struct AppState {
nm: Notmuch,
pool: PgPool,
connection_tracker: Arc<Mutex<ConnectionTracker>>,
}
#[derive(Parser)]
#[command(version, about, long_about = None)]
struct Cli {
#[arg(short, long, default_value = "0.0.0.0:9345")]
addr: SocketAddr,
newsreader_database_url: String,
newsreader_tantivy_db_path: String,
slurp_cache_path: String,
}
#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
let cli = Cli::parse();
let _guard = xtracing::init(env!("CARGO_BIN_NAME"))?;
build_info::build_info!(fn bi);
info!("Build Info: {}", letterbox_shared::build_version(bi));
if !std::fs::exists(&cli.slurp_cache_path)? {
info!("Creating slurp cache @ '{}'", &cli.slurp_cache_path);
std::fs::create_dir_all(&cli.slurp_cache_path)?;
}
let pool = PgPool::connect(&cli.newsreader_database_url).await?;
let nm = Notmuch::default();
sqlx::migrate!("./migrations").run(&pool).await?;
#[cfg(feature = "tantivy")]
let tantivy_conn = TantivyConnection::new(&cli.newsreader_tantivy_db_path)?;
let cacher = FilesystemCacher::new(&cli.slurp_cache_path)?;
let schema = Schema::build(QueryRoot, MutationRoot, SubscriptionRoot)
.data(nm.clone())
.data(cacher)
.data(pool.clone());
let schema = schema.extension(extensions::Logger).finish();
let connection_tracker = Arc::new(Mutex::new(ConnectionTracker::default()));
let ct = Arc::clone(&connection_tracker);
let poll_time = Duration::from_secs(60);
let _h = tokio::spawn(watch_new(nm.clone(), pool.clone(), ct, poll_time));
let api_routes = Router::new()
.route(
"/download/attachment/{id}/{idx}/{*rest}",
get(download_attachment),
)
.route("/view/attachment/{id}/{idx}/{*rest}", get(view_attachment))
.route("/original/{id}", get(view_original))
.route("/cid/{id}/{cid}", get(view_cid))
.route("/ws", any(start_ws))
.route_service("/graphql/ws", GraphQLSubscription::new(schema.clone()))
.route(
"/graphql/",
get(graphiql).post_service(GraphQL::new(schema.clone())),
);
let notification_routes = Router::new()
.route("/mail", post(send_refresh_websocket_handler))
.route("/news", post(send_refresh_websocket_handler));
let app = Router::new()
.nest("/api", api_routes)
.nest("/notification", notification_routes)
.with_state(AppState {
nm,
pool,
connection_tracker,
})
.layer(
TraceLayer::new_for_http()
.make_span_with(DefaultMakeSpan::default().include_headers(true)),
);
let listener = TcpListener::bind(cli.addr).await.unwrap();
tracing::info!("listening on {}", listener.local_addr().unwrap());
axum::serve(
listener,
app.into_make_service_with_connect_info::<SocketAddr>(),
)
.await
.unwrap();
Ok(())
}


@@ -1,172 +0,0 @@
#[macro_use]
extern crate rocket;
use std::{error::Error, io::Cursor, str::FromStr};
use glog::Flags;
use notmuch::{Notmuch, NotmuchError, ThreadSet};
use rocket::{
http::{ContentType, Header},
request::Request,
response::{Debug, Responder},
serde::json::Json,
Response, State,
};
use rocket_cors::{AllowedHeaders, AllowedOrigins};
use server::{error::ServerError, nm::threadset_to_messages};
use shared::Message;
#[get("/")]
fn hello() -> &'static str {
"Hello, world!"
}
#[get("/refresh")]
async fn refresh(nm: &State<Notmuch>) -> Result<Json<String>, Debug<NotmuchError>> {
Ok(Json(String::from_utf8_lossy(&nm.new()?).to_string()))
}
#[get("/search")]
async fn search_all(
nm: &State<Notmuch>,
) -> Result<Json<shared::SearchResult>, Debug<NotmuchError>> {
search(nm, "*", None, None).await
}
#[get("/search/<query>?<page>&<results_per_page>")]
async fn search(
nm: &State<Notmuch>,
query: &str,
page: Option<usize>,
results_per_page: Option<usize>,
) -> Result<Json<shared::SearchResult>, Debug<NotmuchError>> {
let page = page.unwrap_or(0);
let results_per_page = results_per_page.unwrap_or(10);
let query = urlencoding::decode(query).map_err(NotmuchError::from)?;
info!(" search '{query}'");
let res = shared::SearchResult {
summary: nm.search(&query, page * results_per_page, results_per_page)?,
query: query.to_string(),
page,
results_per_page,
total: nm.count(&query)?,
};
Ok(Json(res))
}
#[get("/show/<query>/pretty")]
async fn show_pretty(
nm: &State<Notmuch>,
query: &str,
) -> Result<Json<Vec<Message>>, Debug<ServerError>> {
let query = urlencoding::decode(query).map_err(|e| ServerError::from(NotmuchError::from(e)))?;
let res = threadset_to_messages(nm.show(&query).map_err(ServerError::from)?)?;
Ok(Json(res))
}
#[get("/show/<query>")]
async fn show(nm: &State<Notmuch>, query: &str) -> Result<Json<ThreadSet>, Debug<NotmuchError>> {
let query = urlencoding::decode(query).map_err(NotmuchError::from)?;
let res = nm.show(&query)?;
Ok(Json(res))
}
struct PartResponder {
bytes: Vec<u8>,
filename: Option<String>,
}
impl<'r, 'o: 'r> Responder<'r, 'o> for PartResponder {
fn respond_to(self, _: &'r Request<'_>) -> rocket::response::Result<'o> {
let mut resp = Response::build();
if let Some(filename) = self.filename {
info!("filename {:?}", filename);
resp.header(Header::new(
"Content-Disposition",
format!(r#"attachment; filename="{}""#, filename),
))
.header(ContentType::Binary);
}
resp.sized_body(self.bytes.len(), Cursor::new(self.bytes))
.ok()
}
}
#[get("/original/<id>/part/<part>")]
async fn original_part(
nm: &State<Notmuch>,
id: &str,
part: usize,
) -> Result<PartResponder, Debug<NotmuchError>> {
let mid = if id.starts_with("id:") {
id.to_string()
} else {
format!("id:{}", id)
};
let meta = nm.show_part(&mid, part)?;
let res = nm.show_original_part(&mid, part)?;
Ok(PartResponder {
bytes: res,
filename: meta.filename,
})
}
#[get("/original/<id>")]
async fn original(
nm: &State<Notmuch>,
id: &str,
) -> Result<(ContentType, Vec<u8>), Debug<NotmuchError>> {
let mid = if id.starts_with("id:") {
id.to_string()
} else {
format!("id:{}", id)
};
let res = nm.show_original(&mid)?;
Ok((ContentType::Plain, res))
}
#[rocket::main]
async fn main() -> Result<(), Box<dyn Error>> {
glog::new()
.init(Flags {
colorlogtostderr: true,
//alsologtostderr: true, // use logtostderr to only write to stderr and not to files
logtostderr: true,
..Default::default()
})
.unwrap();
let allowed_origins = AllowedOrigins::all();
let cors = rocket_cors::CorsOptions {
allowed_origins,
allowed_methods: vec!["Get"]
.into_iter()
.map(|s| FromStr::from_str(s).unwrap())
.collect(),
allowed_headers: AllowedHeaders::some(&["Authorization", "Accept"]),
allow_credentials: true,
..Default::default()
}
.to_cors()?;
let _ = rocket::build()
.mount(
"/",
routes![
original_part,
original,
hello,
refresh,
search_all,
search,
show_pretty,
show
],
)
.attach(cors)
.manage(Notmuch::default())
//.manage(Notmuch::with_config("../notmuch/testdata/notmuch.config"))
.launch()
.await?;
Ok(())
}


@@ -0,0 +1,39 @@
use std::error::Error;
use clap::Parser;
use letterbox_notmuch::Notmuch;
use letterbox_server::nm::label_unprocessed;
use sqlx::postgres::PgPool;
use tracing::info;
#[derive(Parser)]
#[command(version, about, long_about = None)]
struct Cli {
#[arg(short, long)]
newsreader_database_url: String,
#[arg(short, long, default_value = "10")]
/// Set to 0 to process all matches
messages_to_process: usize,
#[arg(short, long, default_value = "false")]
execute: bool,
/// Process messages matching this notmuch query
#[arg(short, long, default_value = "tag:unprocessed")]
query: String,
}
#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
let cli = Cli::parse();
let _guard = xtracing::init(env!("CARGO_BIN_NAME"))?;
build_info::build_info!(fn bi);
info!("Build Info: {}", letterbox_shared::build_version(bi));
let pool = PgPool::connect(&cli.newsreader_database_url).await?;
let nm = Notmuch::default();
let limit = if cli.messages_to_process > 0 {
Some(cli.messages_to_process)
} else {
None
};
label_unprocessed(&nm, &pool, !cli.execute, limit, &cli.query).await?;
Ok(())
}

File diff suppressed because it is too large.

server/src/config.rs Normal file

@@ -0,0 +1,7 @@
use serde::Deserialize;
#[derive(Deserialize)]
pub struct Config {
pub newsreader_database_url: String,
pub newsreader_tantivy_db_path: String,
pub slurp_cache_path: String,
}

server/src/email_extract.rs Normal file

File diff suppressed because it is too large.


@@ -1,9 +1,50 @@
use std::{convert::Infallible, str::Utf8Error, string::FromUtf8Error};
use mailparse::MailParseError;
#[cfg(feature = "tantivy")]
use tantivy::{query::QueryParserError, TantivyError};
use thiserror::Error;
use crate::TransformError;
#[derive(Error, Debug)]
pub enum ServerError {
#[error("notmuch")]
NotmuchError(#[from] notmuch::NotmuchError),
#[error("notmuch: {0}")]
NotmuchError(#[from] letterbox_notmuch::NotmuchError),
#[error("flatten")]
FlattenError,
#[error("mail parse error: {0}")]
MailParseError(#[from] MailParseError),
#[error("IO error: {0}")]
IoError(#[from] std::io::Error),
#[error("attachment not found")]
PartNotFound,
#[error("sqlx error: {0}")]
SQLXError(#[from] sqlx::Error),
#[error("html transform error: {0}")]
TransformError(#[from] TransformError),
#[error("UTF8 error: {0}")]
Utf8Error(#[from] Utf8Error),
#[error("FromUTF8 error: {0}")]
FromUtf8Error(#[from] FromUtf8Error),
#[error("error: {0}")]
StringError(String),
#[error("invalid url: {0}")]
UrlParseError(#[from] url::ParseError),
#[cfg(feature = "tantivy")]
#[error("tantivy error: {0}")]
TantivyError(#[from] TantivyError),
#[cfg(feature = "tantivy")]
#[error("tantivy query parse error: {0}")]
QueryParseError(#[from] QueryParserError),
#[error("impossible: {0}")]
InfaillibleError(#[from] Infallible),
#[error("askama error: {0}")]
AskamaError(#[from] askama::Error),
#[error("xml error: {0}")]
XmlError(#[from] quick_xml::Error),
#[error("xml encoding error: {0}")]
XmlEncodingError(#[from] quick_xml::encoding::EncodingError),
#[error("html to text error: {0}")]
Html2TextError(#[from] html2text::Error),
}

server/src/graphql.rs Normal file

@@ -0,0 +1,710 @@
use std::{fmt, str::FromStr};
use async_graphql::{
connection::{self, Connection, Edge, OpaqueCursor},
futures_util::Stream,
Context, Enum, Error, FieldResult, InputObject, Object, Schema, SimpleObject, Subscription,
Union,
};
use cacher::FilesystemCacher;
use futures::stream;
use letterbox_notmuch::Notmuch;
use serde::{Deserialize, Serialize};
use sqlx::postgres::PgPool;
use tokio::join;
use tracing::{info, instrument};
#[cfg(feature = "tantivy")]
use crate::tantivy::TantivyConnection;
use crate::{newsreader, nm, nm::label_unprocessed, Query};
/// # Number of seconds since the Epoch
pub type UnixTime = isize;
/// # Thread ID, sans "thread:"
pub type ThreadId = String;
#[derive(Debug, Enum, Copy, Clone, Eq, PartialEq)]
pub enum Corpus {
Notmuch,
Newsreader,
Tantivy,
}
impl FromStr for Corpus {
type Err = String;
fn from_str(s: &str) -> Result<Self, Self::Err> {
Ok(match s {
"notmuch" => Corpus::Notmuch,
"newsreader" => Corpus::Newsreader,
"tantivy" => Corpus::Tantivy,
s => return Err(format!("unknown corpus: '{s}'")),
})
}
}
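The `FromStr` impl above lets callers write `"notmuch".parse::<Corpus>()`. A self-contained copy showing the round-trip (the real type lives in server/src/graphql.rs; this local redefinition exists only so the example stands alone):

```rust
use std::str::FromStr;

// Local stand-in for graphql::Corpus, with the same FromStr mapping.
#[derive(Debug, PartialEq)]
enum Corpus {
    Notmuch,
    Newsreader,
    Tantivy,
}

impl FromStr for Corpus {
    type Err = String;
    fn from_str(s: &str) -> Result<Self, Self::Err> {
        Ok(match s {
            "notmuch" => Corpus::Notmuch,
            "newsreader" => Corpus::Newsreader,
            "tantivy" => Corpus::Tantivy,
            s => return Err(format!("unknown corpus: '{s}'")),
        })
    }
}
```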
// TODO: add is_read field and remove all use of 'tag:unread'
#[derive(Debug, SimpleObject)]
pub struct ThreadSummary {
pub thread: ThreadId,
pub timestamp: UnixTime,
/// user-friendly timestamp
pub date_relative: String,
/// number of matched messages
pub matched: isize,
/// total messages in thread
pub total: isize,
/// comma-separated names with | between matched and unmatched
pub authors: String,
pub subject: String,
pub tags: Vec<String>,
pub corpus: Corpus,
}
#[derive(Debug, Union)]
pub enum Thread {
Email(EmailThread),
News(NewsPost),
}
#[derive(Debug, SimpleObject)]
pub struct NewsPost {
pub thread_id: String,
pub is_read: bool,
pub slug: String,
pub site: String,
pub title: String,
pub body: String,
pub url: String,
pub timestamp: i64,
}
#[derive(Debug, SimpleObject)]
pub struct EmailThread {
pub thread_id: String,
pub subject: String,
pub messages: Vec<Message>,
}
#[derive(Debug, SimpleObject)]
pub struct Message {
// Message-ID for message, prepend `id:<id>` to search in notmuch
pub id: String,
// First From header found in email
pub from: Option<Email>,
// All To headers found in email
pub to: Vec<Email>,
// All CC headers found in email
pub cc: Vec<Email>,
// X-Original-To header found in email
pub x_original_to: Option<Email>,
// Delivered-To header found in email
pub delivered_to: Option<Email>,
// First Subject header found in email
pub subject: Option<String>,
// Parsed Date header, if found and valid
pub timestamp: Option<i64>,
// Headers
pub headers: Vec<Header>,
// The body contents
pub body: Body,
// On disk location of message
pub path: String,
pub attachments: Vec<Attachment>,
pub tags: Vec<String>,
}
// Content-Type: image/jpeg; name="PXL_20231125_204826860.jpg"
// Content-Disposition: attachment; filename="PXL_20231125_204826860.jpg"
// Content-Transfer-Encoding: base64
// Content-ID: <f_lponoluo1>
// X-Attachment-Id: f_lponoluo1
#[derive(Default, Debug, SimpleObject)]
pub struct Attachment {
pub id: String,
pub idx: String,
pub filename: Option<String>,
pub size: usize,
pub content_type: Option<String>,
pub content_id: Option<String>,
pub disposition: DispositionType,
pub bytes: Vec<u8>,
}
#[derive(Debug, Clone, Eq, PartialEq)]
pub struct Disposition {
pub r#type: DispositionType,
pub filename: Option<String>,
pub size: Option<usize>,
}
#[derive(Debug, Enum, Copy, Clone, Eq, PartialEq)]
pub enum DispositionType {
Inline,
Attachment,
}
impl From<mailparse::DispositionType> for DispositionType {
fn from(value: mailparse::DispositionType) -> Self {
match value {
mailparse::DispositionType::Inline => DispositionType::Inline,
mailparse::DispositionType::Attachment => DispositionType::Attachment,
dt => panic!("unhandled DispositionType {dt:?}"),
}
}
}
impl Default for DispositionType {
fn default() -> Self {
DispositionType::Attachment
}
}
#[derive(Debug, SimpleObject)]
pub struct Header {
pub key: String,
pub value: String,
}
#[derive(Debug)]
pub struct UnhandledContentType {
pub text: String,
pub content_tree: String,
}
#[Object]
impl UnhandledContentType {
async fn contents(&self) -> &str {
&self.text
}
async fn content_tree(&self) -> &str {
&self.content_tree
}
}
#[derive(Debug)]
pub struct PlainText {
pub text: String,
pub content_tree: String,
}
#[Object]
impl PlainText {
async fn contents(&self) -> &str {
&self.text
}
async fn content_tree(&self) -> &str {
&self.content_tree
}
}
#[derive(Debug)]
pub struct Html {
pub html: String,
pub content_tree: String,
}
#[Object]
impl Html {
async fn contents(&self) -> &str {
&self.html
}
async fn content_tree(&self) -> &str {
&self.content_tree
}
async fn headers(&self) -> Vec<Header> {
Vec::new()
}
}
#[derive(Debug, Union)]
pub enum Body {
UnhandledContentType(UnhandledContentType),
PlainText(PlainText),
Html(Html),
}
impl Body {
pub fn html(html: String) -> Body {
Body::Html(Html {
html,
content_tree: "".to_string(),
})
}
pub fn text(text: String) -> Body {
Body::PlainText(PlainText {
text,
content_tree: "".to_string(),
})
}
pub fn to_html(&self) -> Option<String> {
match self {
Body::Html(h) => Some(h.html.clone()),
Body::PlainText(p) => Some(format!("<pre>{}</pre>", html_escape::encode_text(&p.text))),
Body::UnhandledContentType(u) => {
Some(format!("<pre>{}</pre>", html_escape::encode_text(&u.text)))
}
}
}
pub fn to_html_content_tree(&self) -> Option<String> {
match self {
Body::Html(h) => Some(h.content_tree.clone()),
Body::PlainText(p) => Some(p.content_tree.clone()),
Body::UnhandledContentType(u) => Some(u.content_tree.clone()),
}
}
}
#[derive(Debug, SimpleObject)]
pub struct Email {
pub name: Option<String>,
pub addr: Option<String>,
pub photo_url: Option<String>,
}
impl fmt::Display for Email {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> Result<(), std::fmt::Error> {
match (&self.name, &self.addr) {
(Some(name), Some(addr)) => write!(f, "{name} <{addr}>")?,
(Some(name), None) => write!(f, "{name}")?,
(None, Some(addr)) => write!(f, "{addr}")?,
(None, None) => write!(f, "<UNKNOWN>")?,
}
Ok(())
}
}
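The `Display` impl above renders each name/addr combination differently. A standalone sketch of that behavior, with `Email` trimmed to the two fields the formatter reads:

```rust
use std::fmt;

// Reduced copy of Email for illustration; photo_url is omitted
// because Display never reads it.
struct Email {
    name: Option<String>,
    addr: Option<String>,
}

impl fmt::Display for Email {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match (&self.name, &self.addr) {
            (Some(name), Some(addr)) => write!(f, "{name} <{addr}>"),
            (Some(name), None) => write!(f, "{name}"),
            (None, Some(addr)) => write!(f, "{addr}"),
            (None, None) => write!(f, "<UNKNOWN>"),
        }
    }
}

fn main() {
    let full = Email { name: Some("Ada".into()), addr: Some("ada@example.com".into()) };
    assert_eq!(full.to_string(), "Ada <ada@example.com>");
    let anon = Email { name: None, addr: None };
    assert_eq!(anon.to_string(), "<UNKNOWN>");
}
```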
#[derive(SimpleObject)]
pub struct Tag {
pub name: String,
pub fg_color: String,
pub bg_color: String,
pub unread: usize,
}
#[derive(Serialize, Deserialize, Debug, InputObject)]
struct SearchCursor {
newsreader_offset: i32,
notmuch_offset: i32,
#[cfg(feature = "tantivy")]
tantivy_offset: i32,
}
fn request_id() -> String {
let now = std::time::SystemTime::now();
let nanos = now
.duration_since(std::time::SystemTime::UNIX_EPOCH)
.unwrap_or_default()
.as_nanos();
format!("{nanos:x}")
}
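`request_id` produces a hex string from nanoseconds since the epoch, used only to correlate tracing spans. A standalone copy showing the expected shape of its output:

```rust
// Standalone copy of request_id(): hex-encoded nanoseconds since
// UNIX_EPOCH, falling back to 0 if the clock is before the epoch.
fn request_id() -> String {
    let nanos = std::time::SystemTime::now()
        .duration_since(std::time::SystemTime::UNIX_EPOCH)
        .unwrap_or_default()
        .as_nanos();
    format!("{nanos:x}")
}

fn main() {
    let id = request_id();
    // The id is non-empty and purely lowercase hex.
    assert!(!id.is_empty());
    assert!(id.chars().all(|c| c.is_ascii_hexdigit()));
}
```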
pub struct QueryRoot;
#[Object]
impl QueryRoot {
async fn version<'ctx>(&self, _ctx: &Context<'ctx>) -> Result<String, Error> {
build_info::build_info!(fn bi);
Ok(letterbox_shared::build_version(bi))
}
#[instrument(skip_all, fields(query=query, rid=request_id()))]
async fn count<'ctx>(&self, ctx: &Context<'ctx>, query: String) -> Result<usize, Error> {
let nm = ctx.data_unchecked::<Notmuch>();
let pool = ctx.data_unchecked::<PgPool>();
#[cfg(feature = "tantivy")]
let tantivy = ctx.data_unchecked::<TantivyConnection>();
let newsreader_query: Query = query.parse()?;
let newsreader_count = newsreader::count(pool, &newsreader_query).await?;
let notmuch_count = nm::count(nm, &newsreader_query).await?;
#[cfg(feature = "tantivy")]
let tantivy_count = tantivy.count(&newsreader_query).await?;
#[cfg(not(feature = "tantivy"))]
let tantivy_count = 0;
let total = newsreader_count + notmuch_count + tantivy_count;
info!("count {newsreader_query:?} newsreader count {newsreader_count} notmuch count {notmuch_count} tantivy count {tantivy_count} total {total}");
Ok(total)
}
#[instrument(skip_all, fields(query=query, rid=request_id()))]
async fn catchup<'ctx>(
&self,
ctx: &Context<'ctx>,
query: String,
) -> Result<Vec<String>, Error> {
let nm = ctx.data_unchecked::<Notmuch>();
let pool = ctx.data_unchecked::<PgPool>();
compute_catchup_ids(nm, pool, &query).await
}
// TODO: this function doesn't get parallelism, possibly because notmuch is sync and blocks,
// rewrite that with tokio::process::Command
#[instrument(skip_all, fields(query=query, rid=request_id()))]
async fn search<'ctx>(
&self,
ctx: &Context<'ctx>,
after: Option<String>,
before: Option<String>,
first: Option<i32>,
last: Option<i32>,
query: String,
) -> Result<Connection<OpaqueCursor<SearchCursor>, ThreadSummary>, Error> {
info!("search({after:?} {before:?} {first:?} {last:?} {query:?})",);
let nm = ctx.data_unchecked::<Notmuch>();
let pool = ctx.data_unchecked::<PgPool>();
#[cfg(feature = "tantivy")]
let tantivy = ctx.data_unchecked::<TantivyConnection>();
Ok(connection::query(
after,
before,
first,
last,
|after: Option<OpaqueCursor<SearchCursor>>,
before: Option<OpaqueCursor<SearchCursor>>,
first: Option<usize>,
last: Option<usize>| async move {
info!(
"search(after {:?} before {:?} first {first:?} last {last:?} query: {query:?})",
after.as_ref().map(|v| &v.0),
before.as_ref().map(|v| &v.0)
);
let newsreader_after = after.as_ref().map(|sc| sc.newsreader_offset);
let notmuch_after = after.as_ref().map(|sc| sc.notmuch_offset);
#[cfg(feature = "tantivy")]
let tantivy_after = after.as_ref().map(|sc| sc.tantivy_offset);
let newsreader_before = before.as_ref().map(|sc| sc.newsreader_offset);
let notmuch_before = before.as_ref().map(|sc| sc.notmuch_offset);
#[cfg(feature = "tantivy")]
let tantivy_before = before.as_ref().map(|sc| sc.tantivy_offset);
let first = first.map(|v| v as i32);
let last = last.map(|v| v as i32);
let query: Query = query.parse()?;
info!("newsreader_query {query:?}");
let newsreader_fut = newsreader_search(
pool,
newsreader_after,
newsreader_before,
first,
last,
&query,
);
let notmuch_fut =
notmuch_search(nm, notmuch_after, notmuch_before, first, last, &query);
#[cfg(feature = "tantivy")]
let tantivy_fut = tantivy_search(
tantivy,
pool,
tantivy_after,
tantivy_before,
first,
last,
&query,
);
#[cfg(not(feature = "tantivy"))]
let tantivy_fut =
async { Ok::<Vec<ThreadSummaryCursor>, async_graphql::Error>(Vec::new()) };
let (newsreader_results, notmuch_results, tantivy_results) =
join!(newsreader_fut, notmuch_fut, tantivy_fut);
let newsreader_results = newsreader_results?;
let notmuch_results = notmuch_results?;
let tantivy_results = tantivy_results?;
info!(
"newsreader_results ({}) notmuch_results ({}) tantivy_results ({})",
newsreader_results.len(),
notmuch_results.len(),
tantivy_results.len()
);
let mut results: Vec<_> = newsreader_results
.into_iter()
.chain(notmuch_results)
.chain(tantivy_results)
.collect();
// The leading '-' is to reverse sort
results.sort_by_key(|item| match item {
ThreadSummaryCursor::Newsreader(_, ts) => -ts.timestamp,
ThreadSummaryCursor::Notmuch(_, ts) => -ts.timestamp,
#[cfg(feature = "tantivy")]
ThreadSummaryCursor::Tantivy(_, ts) => -ts.timestamp,
});
let mut has_next_page = before.is_some();
if let Some(first) = first {
let first = first as usize;
if results.len() > first {
has_next_page = true;
results.truncate(first);
}
}
let mut has_previous_page = after.is_some();
if let Some(last) = last {
let last = last as usize;
if results.len() > last {
has_previous_page = true;
results.truncate(last);
}
}
let mut connection = Connection::new(has_previous_page, has_next_page);
// Set starting offset as the value from cursor to preserve state if no results from a corpus survived the truncation
let mut newsreader_offset =
after.as_ref().map(|sc| sc.newsreader_offset).unwrap_or(0);
let mut notmuch_offset = after.as_ref().map(|sc| sc.notmuch_offset).unwrap_or(0);
#[cfg(feature = "tantivy")]
let mut tantivy_offset = after.as_ref().map(|sc| sc.tantivy_offset).unwrap_or(0);
info!(
"newsreader_offset ({}) notmuch_offset ({})",
newsreader_offset, notmuch_offset,
);
connection.edges.extend(results.into_iter().map(|item| {
let thread_summary;
match item {
ThreadSummaryCursor::Newsreader(offset, ts) => {
thread_summary = ts;
newsreader_offset = offset;
}
ThreadSummaryCursor::Notmuch(offset, ts) => {
thread_summary = ts;
notmuch_offset = offset;
}
#[cfg(feature = "tantivy")]
ThreadSummaryCursor::Tantivy(offset, ts) => {
thread_summary = ts;
tantivy_offset = offset;
}
}
let cur = OpaqueCursor(SearchCursor {
newsreader_offset,
notmuch_offset,
#[cfg(feature = "tantivy")]
tantivy_offset,
});
Edge::new(cur, thread_summary)
}));
Ok::<_, async_graphql::Error>(connection)
},
)
.await?)
}
#[instrument(skip_all, fields(rid=request_id()))]
async fn tags<'ctx>(&self, ctx: &Context<'ctx>) -> FieldResult<Vec<Tag>> {
let nm = ctx.data_unchecked::<Notmuch>();
let pool = ctx.data_unchecked::<PgPool>();
let needs_unread = ctx.look_ahead().field("unread").exists();
let mut tags = newsreader::tags(pool, needs_unread).await?;
tags.append(&mut nm::tags(nm, needs_unread)?);
Ok(tags)
}
#[instrument(skip_all, fields(thread_id=thread_id, rid=request_id()))]
async fn thread<'ctx>(&self, ctx: &Context<'ctx>, thread_id: String) -> Result<Thread, Error> {
let nm = ctx.data_unchecked::<Notmuch>();
let cacher = ctx.data_unchecked::<FilesystemCacher>();
let pool = ctx.data_unchecked::<PgPool>();
let debug_content_tree = ctx
.look_ahead()
.field("messages")
.field("body")
.field("contentTree")
.exists();
if newsreader::is_newsreader_thread(&thread_id) {
Ok(newsreader::thread(cacher, pool, thread_id).await?)
} else {
Ok(nm::thread(nm, pool, thread_id, debug_content_tree).await?)
}
}
}
#[derive(Debug)]
enum ThreadSummaryCursor {
Newsreader(i32, ThreadSummary),
Notmuch(i32, ThreadSummary),
#[cfg(feature = "tantivy")]
Tantivy(i32, ThreadSummary),
}
async fn newsreader_search(
pool: &PgPool,
after: Option<i32>,
before: Option<i32>,
first: Option<i32>,
last: Option<i32>,
query: &Query,
) -> Result<Vec<ThreadSummaryCursor>, async_graphql::Error> {
Ok(newsreader::search(pool, after, before, first, last, query)
.await?
.into_iter()
.map(|(cur, ts)| ThreadSummaryCursor::Newsreader(cur, ts))
.collect())
}
async fn notmuch_search(
nm: &Notmuch,
after: Option<i32>,
before: Option<i32>,
first: Option<i32>,
last: Option<i32>,
query: &Query,
) -> Result<Vec<ThreadSummaryCursor>, async_graphql::Error> {
Ok(nm::search(nm, after, before, first, last, query)
.await?
.into_iter()
.map(|(cur, ts)| ThreadSummaryCursor::Notmuch(cur, ts))
.collect())
}
#[cfg(feature = "tantivy")]
async fn tantivy_search(
tantivy: &TantivyConnection,
pool: &PgPool,
after: Option<i32>,
before: Option<i32>,
first: Option<i32>,
last: Option<i32>,
query: &Query,
) -> Result<Vec<ThreadSummaryCursor>, async_graphql::Error> {
Ok(tantivy
.search(pool, after, before, first, last, query)
.await?
.into_iter()
.map(|(cur, ts)| ThreadSummaryCursor::Tantivy(cur, ts))
.collect())
}
pub struct MutationRoot;
#[Object]
impl MutationRoot {
#[instrument(skip_all, fields(query=query, unread=unread, rid=request_id()))]
async fn set_read_status<'ctx>(
&self,
ctx: &Context<'ctx>,
query: String,
unread: bool,
) -> Result<bool, Error> {
let nm = ctx.data_unchecked::<Notmuch>();
let pool = ctx.data_unchecked::<PgPool>();
#[cfg(feature = "tantivy")]
let tantivy = ctx.data_unchecked::<TantivyConnection>();
let query: Query = query.parse()?;
newsreader::set_read_status(pool, &query, unread).await?;
#[cfg(feature = "tantivy")]
tantivy.reindex_thread(pool, &query).await?;
nm::set_read_status(nm, &query, unread).await?;
Ok(true)
}
#[instrument(skip_all, fields(query=query, tag=tag, rid=request_id()))]
async fn tag_add<'ctx>(
&self,
ctx: &Context<'ctx>,
query: String,
tag: String,
) -> Result<bool, Error> {
let nm = ctx.data_unchecked::<Notmuch>();
info!("tag_add({tag}, {query})");
nm.tag_add(&tag, &query)?;
Ok(true)
}
#[instrument(skip_all, fields(query=query, tag=tag, rid=request_id()))]
async fn tag_remove<'ctx>(
&self,
ctx: &Context<'ctx>,
query: String,
tag: String,
) -> Result<bool, Error> {
let nm = ctx.data_unchecked::<Notmuch>();
info!("tag_remove({tag}, {query})");
nm.tag_remove(&tag, &query)?;
Ok(true)
}
/// Drop and recreate tantivy index. Warning this is slow
#[cfg(feature = "tantivy")]
async fn drop_and_load_index<'ctx>(&self, ctx: &Context<'ctx>) -> Result<bool, Error> {
let tantivy = ctx.data_unchecked::<TantivyConnection>();
let pool = ctx.data_unchecked::<PgPool>();
tantivy.drop_and_load_index()?;
tantivy.reindex_all(pool).await?;
Ok(true)
}
#[instrument(skip_all, fields(rid=request_id()))]
async fn refresh<'ctx>(&self, ctx: &Context<'ctx>) -> Result<bool, Error> {
let nm = ctx.data_unchecked::<Notmuch>();
let cacher = ctx.data_unchecked::<FilesystemCacher>();
let pool = ctx.data_unchecked::<PgPool>();
info!("{}", String::from_utf8_lossy(&nm.new()?));
newsreader::refresh(pool, cacher).await?;
// Process email labels
label_unprocessed(nm, pool, false, Some(10), "tag:unprocessed").await?;
#[cfg(feature = "tantivy")]
{
let tantivy = ctx.data_unchecked::<TantivyConnection>();
// TODO: parallelize
tantivy.refresh(pool).await?;
}
Ok(true)
}
}
pub struct SubscriptionRoot;
#[Subscription]
impl SubscriptionRoot {
async fn values(&self, _ctx: &Context<'_>) -> Result<impl Stream<Item = usize>, Error> {
Ok(stream::iter(0..10))
}
}
pub type GraphqlSchema = Schema<QueryRoot, MutationRoot, SubscriptionRoot>;
#[instrument(skip_all, fields(query=query))]
pub async fn compute_catchup_ids(
nm: &Notmuch,
pool: &PgPool,
query: &str,
) -> Result<Vec<String>, Error> {
let query: Query = query.parse()?;
// TODO: implement optimized versions of fetching just IDs
let newsreader_fut = newsreader_search(pool, None, None, None, None, &query);
let notmuch_fut = notmuch_search(nm, None, None, None, None, &query);
let (newsreader_results, notmuch_results) = join!(newsreader_fut, notmuch_fut);
let newsreader_results = newsreader_results?;
let notmuch_results = notmuch_results?;
info!(
"newsreader_results ({}) notmuch_results ({})",
newsreader_results.len(),
notmuch_results.len(),
);
let mut results: Vec<_> = newsreader_results
.into_iter()
.chain(notmuch_results)
.collect();
// The leading '-' is to reverse sort
results.sort_by_key(|item| match item {
ThreadSummaryCursor::Newsreader(_, ts) => -ts.timestamp,
ThreadSummaryCursor::Notmuch(_, ts) => -ts.timestamp,
});
let ids = results
.into_iter()
.map(|r| match r {
ThreadSummaryCursor::Newsreader(_, ts) => ts.thread,
ThreadSummaryCursor::Notmuch(_, ts) => ts.thread,
})
.collect();
Ok(ids)
}


@@ -1,2 +1,963 @@
pub mod config;
pub mod email_extract;
pub mod error;
pub mod graphql;
pub mod newsreader;
pub mod nm;
pub mod ws;
#[cfg(feature = "tantivy")]
pub mod tantivy;
use std::{
collections::{HashMap, HashSet},
convert::Infallible,
fmt,
str::FromStr,
sync::Arc,
};
use async_trait::async_trait;
use cacher::{Cacher, FilesystemCacher};
use css_inline::{CSSInliner, InlineError, InlineOptions};
pub use error::ServerError;
use linkify::{LinkFinder, LinkKind};
use lol_html::{
element, errors::RewritingError, html_content::ContentType, rewrite_str, text,
RewriteStrSettings,
};
use maplit::{hashmap, hashset};
use regex::Regex;
use reqwest::StatusCode;
use scraper::{Html, Selector};
use sqlx::types::time::PrimitiveDateTime;
use thiserror::Error;
use tracing::{debug, error, info, warn};
use url::Url;
use crate::{
graphql::{Corpus, ThreadSummary},
newsreader::is_newsreader_thread,
nm::is_notmuch_thread_or_id,
};
const NEWSREADER_TAG_PREFIX: &str = "News/";
const NEWSREADER_THREAD_PREFIX: &str = "news:";
// TODO: figure out how to use Cow
#[async_trait]
trait Transformer: Send + Sync {
fn should_run(&self, _addr: &Option<Url>, _html: &str) -> bool {
true
}
// TODO: should html be something like `html_escape` uses:
// <S: ?Sized + AsRef<str>>(text: &S) -> Cow<str>
async fn transform(&self, addr: &Option<Url>, html: &str) -> Result<String, TransformError>;
}
// TODO: how would we make this more generic to allow good implementations of Transformer outside
// of this module?
#[derive(Error, Debug)]
pub enum TransformError {
#[error("lol-html rewrite error: {0}")]
RewritingError(#[from] RewritingError),
#[error("css inline error: {0}")]
InlineError(#[from] InlineError),
#[error("failed to fetch url error: {0}")]
ReqwestError(#[from] reqwest::Error),
#[error("failed to parse HTML: {0}")]
HtmlParsingError(String),
#[error("got a retryable error code {0} for {1}")]
RetryableHttpStatusError(StatusCode, String),
}
struct SanitizeHtml<'a> {
cid_prefix: &'a str,
base_url: &'a Option<Url>,
}
#[async_trait]
impl<'a> Transformer for SanitizeHtml<'a> {
async fn transform(&self, _: &Option<Url>, html: &str) -> Result<String, TransformError> {
Ok(sanitize_html(html, self.cid_prefix, self.base_url)?)
}
}
struct EscapeHtml;
#[async_trait]
impl Transformer for EscapeHtml {
fn should_run(&self, _: &Option<Url>, html: &str) -> bool {
html.contains("&")
}
async fn transform(&self, _: &Option<Url>, html: &str) -> Result<String, TransformError> {
Ok(html_escape::decode_html_entities(html).to_string())
}
}
struct StripHtml;
#[async_trait]
impl Transformer for StripHtml {
fn should_run(&self, link: &Option<Url>, html: &str) -> bool {
debug!("StripHtml should_run {link:?} {}", html.contains("<"));
// Crude heuristic: treat the presence of '<' as evidence of HTML markup
html.contains("<")
}
async fn transform(&self, link: &Option<Url>, html: &str) -> Result<String, TransformError> {
debug!("StripHtml {link:?}");
let mut text = String::new();
let element_content_handlers = vec![
element!("style", |el| {
el.remove();
Ok(())
}),
element!("script", |el| {
el.remove();
Ok(())
}),
];
let html = rewrite_str(
html,
RewriteStrSettings {
element_content_handlers,
..RewriteStrSettings::default()
},
)?;
let element_content_handlers = vec![text!("*", |t| {
text += t.as_str();
Ok(())
})];
let _ = rewrite_str(
&html,
RewriteStrSettings {
element_content_handlers,
..RewriteStrSettings::default()
},
)?;
let re = Regex::new(r"\s+").expect("failed to parse regex");
let text = re.replace_all(&text, " ").to_string();
Ok(text)
}
}
struct InlineStyle;
#[async_trait]
impl Transformer for InlineStyle {
async fn transform(&self, _: &Option<Url>, html: &str) -> Result<String, TransformError> {
let css = concat!(
"/* chrome-default.css */\n",
include_str!("chrome-default.css"),
//"\n/* mvp.css */\n",
//include_str!("mvp.css"),
//"\n/* Xinu Specific overrides */\n",
//include_str!("custom.css"),
);
let inline_opts = InlineOptions {
inline_style_tags: true,
keep_style_tags: false,
keep_link_tags: true,
base_url: None,
load_remote_stylesheets: true,
extra_css: Some(css.into()),
preallocate_node_capacity: 32,
..InlineOptions::default()
};
//info!("HTML:\n{html}");
Ok(match CSSInliner::new(inline_opts).inline(&html) {
Ok(inlined_html) => inlined_html,
Err(err) => {
error!("failed to inline CSS: {err}");
html.to_string()
}
})
}
}
/// FrameImages extracts any alt or title attributes on images and renders them as figcaption
/// labels below the image. It also promotes lazy-load data-src and data-cfsrc attributes to src.
struct FrameImages;
#[async_trait]
impl Transformer for FrameImages {
async fn transform(&self, _: &Option<Url>, html: &str) -> Result<String, TransformError> {
Ok(rewrite_str(
html,
RewriteStrSettings {
element_content_handlers: vec![
element!("img[data-src]", |el| {
let src = el
.get_attribute("data-src")
.unwrap_or("https://placehold.co/600x400".to_string());
el.set_attribute("src", &src)?;
Ok(())
}),
element!("img[data-cfsrc]", |el| {
let src = el
.get_attribute("data-cfsrc")
.unwrap_or("https://placehold.co/600x400".to_string());
el.set_attribute("src", &src)?;
Ok(())
}),
element!("img[alt], img[title]", |el| {
let src = el
.get_attribute("src")
.unwrap_or("https://placehold.co/600x400".to_string());
let alt = el.get_attribute("alt");
let title = el.get_attribute("title");
let mut frags =
vec!["<figure>".to_string(), format!(r#"<img src="{src}">"#)];
alt.map(|t| {
if !t.is_empty() {
frags.push(format!("<figcaption>Alt: {t}</figcaption>"))
}
});
title.map(|t| {
if !t.is_empty() {
frags.push(format!("<figcaption>Title: {t}</figcaption>"))
}
});
frags.push("</figure>".to_string());
el.replace(&frags.join("\n"), ContentType::Html);
Ok(())
}),
],
..RewriteStrSettings::default()
},
)?)
}
}
struct AddOutlink;
#[async_trait]
impl Transformer for AddOutlink {
fn should_run(&self, link: &Option<Url>, html: &str) -> bool {
if let Some(link) = link {
link.scheme().starts_with("http") && !html.contains(link.as_str())
} else {
false
}
}
async fn transform(&self, link: &Option<Url>, html: &str) -> Result<String, TransformError> {
if let Some(link) = link {
Ok(format!(
r#"
{html}
<div><a href="{}">View on site</a></div>
"#,
link
))
} else {
Ok(html.to_string())
}
}
}
struct SlurpContents<'c> {
cacher: &'c FilesystemCacher,
inline_css: bool,
site_selectors: HashMap<String, Vec<Selector>>,
}
impl<'c> SlurpContents<'c> {
fn get_selectors(&self, link: &Url) -> Option<&[Selector]> {
for (host, selector) in self.site_selectors.iter() {
if link.host_str().map(|h| h.contains(host)).unwrap_or(false) {
return Some(selector);
}
}
None
}
}
#[async_trait]
impl<'c> Transformer for SlurpContents<'c> {
fn should_run(&self, link: &Option<Url>, html: &str) -> bool {
debug!("SlurpContents should_run {link:?}");
let mut will_slurp = false;
if let Some(link) = link {
will_slurp = self.get_selectors(link).is_some();
}
if !will_slurp && self.inline_css {
return InlineStyle {}.should_run(link, html);
}
will_slurp
}
async fn transform(&self, link: &Option<Url>, html: &str) -> Result<String, TransformError> {
debug!("SlurpContents {link:?}");
let retryable_status: HashSet<StatusCode> = vec![
StatusCode::UNAUTHORIZED,
StatusCode::FORBIDDEN,
StatusCode::REQUEST_TIMEOUT,
StatusCode::TOO_MANY_REQUESTS,
]
.into_iter()
.collect();
if let Some(test_link) = link {
// If SlurpContents is configured for inline CSS, but no
// configuration found for this site, use the local InlineStyle
// transform.
if self.inline_css && self.get_selectors(test_link).is_none() {
debug!("local inline CSS for {link:?}");
return InlineStyle {}.transform(link, html).await;
}
}
let Some(link) = link else {
return Ok(html.to_string());
};
let Some(selectors) = self.get_selectors(link) else {
return Ok(html.to_string());
};
let cacher = self.cacher;
let body = if let Some(body) = cacher.get(link.as_str()) {
String::from_utf8_lossy(&body).to_string()
} else {
let resp = reqwest::get(link.as_str()).await?;
let status = resp.status();
if status.is_server_error() {
error!("status error for {link}: {status}");
return Ok(html.to_string());
}
if retryable_status.contains(&status) {
error!("retryable error for {link}: {status}");
return Ok(html.to_string());
}
if !status.is_success() {
error!("unsuccessful for {link}: {status}");
return Ok(html.to_string());
}
let body = resp.text().await?;
cacher.set(link.as_str(), body.as_bytes());
body
};
let body = Arc::new(body);
let base_url = Some(link.clone());
let body = if self.inline_css {
debug!("inlining CSS for {link}");
let inner_body = Arc::clone(&body);
let res = tokio::task::spawn_blocking(move || {
let css = concat!(
"/* chrome-default.css */\n",
include_str!("chrome-default.css"),
"\n/* vars.css */\n",
include_str!("../static/vars.css"),
//"\n/* Xinu Specific overrides */\n",
//include_str!("custom.css"),
);
let res = CSSInliner::options()
.base_url(base_url)
.extra_css(Some(std::borrow::Cow::Borrowed(css)))
.build()
.inline(&inner_body);
match res {
Ok(inlined_html) => inlined_html,
Err(err) => {
error!("failed to inline remote CSS: {err}");
Arc::into_inner(inner_body).expect("failed to take body out of Arc")
}
}
})
.await;
match res {
Ok(inlined_html) => inlined_html,
Err(err) => {
error!("failed to spawn inline remote CSS: {err}");
Arc::into_inner(body).expect("failed to take body out of Arc")
}
}
} else {
debug!("using body as-is for {link:?}");
Arc::into_inner(body).expect("failed to take body out of Arc")
};
let doc = Html::parse_document(&body);
let mut results = Vec::new();
for selector in selectors {
for frag in doc.select(selector) {
results.push(frag.html())
// TODO: figure out how to warn if there were no hits
//warn!("couldn't find '{:?}' in {}", selector, link);
}
}
Ok(results.join("<br>"))
}
}
pub fn linkify_html(text: &str) -> String {
let mut finder = LinkFinder::new();
let finder = finder.url_must_have_scheme(false).kinds(&[LinkKind::Url]);
let mut parts = Vec::new();
for span in finder.spans(text) {
// TODO(wathiede): use Cow<str>?
match span.kind() {
// Text as-is
None => parts.push(span.as_str().to_string()),
// Wrap in anchor tag
Some(LinkKind::Url) => {
let text = span.as_str();
let scheme = if text.starts_with("http") {
""
} else {
"http://"
};
let a = format!(r#"<a href="{scheme}{0}">{0}</a>"#, text);
parts.push(a);
}
_ => todo!("unhandled kind: {:?}", span.kind().unwrap()),
}
}
parts.join("")
}
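The anchor-wrapping step above prepends "http://" to the href when the detected URL has no scheme, while leaving the visible text untouched. A minimal sketch of just that step, extracted into a hypothetical helper (the real function additionally uses the linkify crate to find URL spans):

```rust
// Hypothetical helper mirroring the anchor-wrapping arm of linkify_html:
// schemeless URLs get "http://" in the href, the link text stays as-is.
fn wrap_in_anchor(text: &str) -> String {
    let scheme = if text.starts_with("http") { "" } else { "http://" };
    format!(r#"<a href="{scheme}{0}">{0}</a>"#, text)
}

fn main() {
    assert_eq!(
        wrap_in_anchor("example.com"),
        r#"<a href="http://example.com">example.com</a>"#
    );
    assert_eq!(
        wrap_in_anchor("https://example.com"),
        r#"<a href="https://example.com">https://example.com</a>"#
    );
}
```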
// html contains the content to be cleaned, and cid_prefix is used to resolve mixed part image
// references
pub fn sanitize_html(
html: &str,
cid_prefix: &str,
base_url: &Option<Url>,
) -> Result<String, TransformError> {
let inline_opts = InlineOptions {
inline_style_tags: true,
keep_style_tags: true,
keep_link_tags: false,
base_url: None,
load_remote_stylesheets: false,
extra_css: None,
preallocate_node_capacity: 32,
..InlineOptions::default()
};
let html = match CSSInliner::new(inline_opts).inline(&html) {
Ok(inlined_html) => inlined_html,
Err(err) => {
error!("failed to inline CSS: {err}");
html.to_string()
}
};
let mut element_content_handlers = vec![
// Remove width and height attributes on elements
element!("[width],[height]", |el| {
el.remove_attribute("width");
el.remove_attribute("height");
Ok(())
}),
// Remove width and height values from inline styles
element!("[style]", |el| {
let style = el.get_attribute("style").unwrap();
let style = style
.split(";")
.filter(|s| {
let Some((k, _)) = s.split_once(':') else {
return true;
};
match k {
"width" | "max-width" | "min-width" | "height" | "max-height"
| "min-height" => false,
_ => true,
}
})
.collect::<Vec<_>>()
.join(";");
if let Err(e) = el.set_attribute("style", &style) {
error!("Failed to set style attribute: {e}");
}
Ok(())
}),
// Open links in new tab
element!("a[href]", |el| {
el.set_attribute("target", "_blank").unwrap();
Ok(())
}),
// Replace mixed part CID images with URL
element!("img[src]", |el| {
let src = el
.get_attribute("src")
.expect("src was required")
.replace("cid:", cid_prefix);
el.set_attribute("src", &src)?;
Ok(())
}),
// Only secure image URLs
element!("img[src]", |el| {
let src = el
.get_attribute("src")
.expect("src was required")
.replace("http:", "https:");
el.set_attribute("src", &src)?;
Ok(())
}),
// Add https to href with //<domain name>
element!("link[href]", |el| {
info!("found link[href] {el:?}");
let mut href = el.get_attribute("href").expect("href was required");
if href.starts_with("//") {
warn!("adding https to {href}");
href.insert_str(0, "https:");
}
el.set_attribute("href", &href)?;
Ok(())
}),
// Add https to src with //<domain name>
element!("style[src]", |el| {
let mut src = el.get_attribute("src").expect("src was required");
if src.starts_with("//") {
src.insert_str(0, "https:");
}
el.set_attribute("src", &src)?;
Ok(())
}),
];
if let Some(base_url) = base_url {
element_content_handlers.extend(vec![
// Make links with relative URLs absolute
element!("a[href]", |el| {
if let Some(Ok(href)) = el.get_attribute("href").map(|href| base_url.join(&href)) {
el.set_attribute("href", &href.as_str()).unwrap();
}
Ok(())
}),
// Make images with relative srcs absolute
element!("img[src]", |el| {
if let Some(Ok(src)) = el.get_attribute("src").map(|src| base_url.join(&src)) {
el.set_attribute("src", &src.as_str()).unwrap();
}
Ok(())
}),
]);
}
let html = rewrite_str(
&html,
RewriteStrSettings {
element_content_handlers,
..RewriteStrSettings::default()
},
)?;
// Defaults don't allow style, but we want to preserve it.
// TODO: remove 'class' if rendering mails moves to a two phase process where abstract message
// types are collected, sanitized, and then grouped together as one big HTML doc
let attributes = hashset![
"align", "bgcolor", "class", "color", "height", "lang", "title", "width", "style",
];
let tags = hashset![
"a",
"abbr",
"acronym",
"area",
"article",
"aside",
"b",
"bdi",
"bdo",
"blockquote",
"br",
"caption",
"center",
"cite",
"code",
"col",
"colgroup",
"data",
"dd",
"del",
"details",
"dfn",
"div",
"dl",
"dt",
"em",
"figcaption",
"figure",
"footer",
"h1",
"h2",
"h3",
"h4",
"h5",
"h6",
"header",
"hgroup",
"hr",
"i",
"iframe", // wathiede
"img",
"ins",
"kbd",
"li",
"map",
"mark",
"nav",
"noscript", // wathiede
"ol",
"p",
"pre",
"q",
"rp",
"rt",
"rtc",
"ruby",
"s",
"samp",
"small",
"span",
"strike",
"strong",
"sub",
"summary",
"sup",
"table",
"tbody",
"td",
"tfoot",
"th",
"thead",
"time",
"title", // wathiede
"tr",
"tt",
"u",
"ul",
"var",
"wbr",
];
let tag_attributes = hashmap![
"a" => hashset![
"href", "hreflang", "target",
],
"bdo" => hashset![
"dir"
],
"blockquote" => hashset![
"cite"
],
"col" => hashset![
"align", "char", "charoff", "span"
],
"colgroup" => hashset![
"align", "char", "charoff", "span"
],
"del" => hashset![
"cite", "datetime"
],
"hr" => hashset![
"align", "size", "width"
],
"iframe" => hashset![
"src", "allow", "allowfullscreen"
],
"img" => hashset![
"align", "alt", "height", "src", "width"
],
"ins" => hashset![
"cite", "datetime"
],
"ol" => hashset![
"start"
],
"q" => hashset![
"cite"
],
"table" => hashset![
"align", "border", "cellpadding", "cellspacing", "char", "charoff", "summary",
],
"tbody" => hashset![
"align", "char", "charoff"
],
"td" => hashset![
"align", "char", "charoff", "colspan", "headers", "rowspan"
],
"tfoot" => hashset![
"align", "char", "charoff"
],
"th" => hashset![
"align", "char", "charoff", "colspan", "headers", "rowspan", "scope"
],
"thead" => hashset![
"align", "char", "charoff"
],
"tr" => hashset![
"align", "char", "charoff"
],
];
let html = ammonia::Builder::default()
.tags(tags)
.tag_attributes(tag_attributes)
.generic_attributes(attributes)
.clean(&html)
.to_string();
Ok(html)
}
fn compute_offset_limit(
after: Option<i32>,
before: Option<i32>,
first: Option<i32>,
last: Option<i32>,
) -> (i32, i32) {
let default_page_size = 10000;
match (after, before, first, last) {
// Reasonable defaults
(None, None, None, None) => (0, default_page_size),
(None, None, Some(first), None) => (0, first),
(Some(after), None, None, None) => (after + 1, default_page_size),
(Some(after), None, Some(first), None) => (after + 1, first),
(None, Some(before), None, None) => (0.max(before - default_page_size), default_page_size),
(None, Some(before), None, Some(last)) => (0.max(before - last), last),
(None, None, None, Some(_)) => {
panic!("specifying last and no before doesn't make sense")
}
(None, None, Some(_), Some(_)) => {
panic!("specifying first and last doesn't make sense")
}
(None, Some(_), Some(_), _) => {
panic!("specifying before and first doesn't make sense")
}
(Some(_), Some(_), _, _) => {
panic!("specifying after and before doesn't make sense")
}
(Some(_), None, None, Some(_)) => {
panic!("specifying after and last doesn't make sense")
}
(Some(_), None, Some(_), Some(_)) => {
panic!("specifying after, first and last doesn't make sense")
}
}
}
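The cursor arguments follow Relay-style pagination; a condensed, standalone copy of the supported arms of `compute_offset_limit` makes the offset/limit mapping easy to exercise in isolation:

```rust
// Standalone sketch of the supported cursor combinations above; the
// unsupported arms are collapsed into a single panic for brevity.
fn compute_offset_limit(
    after: Option<i32>,
    before: Option<i32>,
    first: Option<i32>,
    last: Option<i32>,
) -> (i32, i32) {
    let default_page_size = 10000;
    match (after, before, first, last) {
        (None, None, None, None) => (0, default_page_size),
        (None, None, Some(first), None) => (0, first),
        (Some(after), None, None, None) => (after + 1, default_page_size),
        (Some(after), None, Some(first), None) => (after + 1, first),
        (None, Some(before), None, None) => (0.max(before - default_page_size), default_page_size),
        (None, Some(before), None, Some(last)) => (0.max(before - last), last),
        _ => panic!("unsupported cursor combination"),
    }
}

fn main() {
    // `first: 20` pages forward from the start.
    assert_eq!(compute_offset_limit(None, None, Some(20), None), (0, 20));
    // `after: 9, first: 20` starts on the row *after* cursor 9.
    assert_eq!(compute_offset_limit(Some(9), None, Some(20), None), (10, 20));
    // `before: 5, last: 20` clamps the offset at zero near the start.
    assert_eq!(compute_offset_limit(None, Some(5), None, Some(20)), (0, 20));
}
```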
#[derive(Debug, Default)]
pub struct Query {
pub unread_only: bool,
pub tags: Vec<String>,
pub uids: Vec<String>,
pub remainder: Vec<String>,
pub is_notmuch: bool,
pub is_newsreader: bool,
pub is_tantivy: bool,
pub corpus: Option<Corpus>,
}
impl fmt::Display for Query {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> Result<(), std::fmt::Error> {
if self.unread_only {
write!(f, "is:unread ")?;
}
for tag in &self.tags {
write!(f, "tag:{tag} ")?;
}
for uid in &self.uids {
write!(f, "id:{uid} ")?;
}
if self.is_notmuch {
write!(f, "is:mail ")?;
}
if self.is_newsreader {
write!(f, "is:newsreader ")?;
}
if let Some(c) = self.corpus {
write!(f, "corpus:{c:?} ")?;
}
for rem in &self.remainder {
write!(f, "{rem} ")?;
}
Ok(())
}
}
impl Query {
// Converts the internal state of Query to something suitable for notmuch queries. Removes any
// letterbox-specific '<key>:<value>' terms.
fn to_notmuch(&self) -> String {
let mut parts = Vec::new();
if !self.is_notmuch {
return String::new();
}
if self.unread_only {
parts.push("is:unread".to_string());
}
for tag in &self.tags {
parts.push(format!("tag:{tag}"));
}
for uid in &self.uids {
parts.push(uid.clone());
}
for r in &self.remainder {
// Rewrite "to:" to include ExtraTo:. ExtraTo: is configured in
// notmuch-config to index Delivered-To and X-Original-To headers.
if r.starts_with("to:") {
parts.push("(".to_string());
parts.push(r.to_string());
parts.push("OR".to_string());
parts.push(r.replace("to:", "ExtraTo:"));
parts.push(")".to_string());
} else {
parts.push(r.to_string());
}
}
parts.join(" ")
}
}
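The `to:` rewrite in `to_notmuch` can be isolated into a standalone helper (hypothetical name `rewrite_to_term`) to show the notmuch terms it emits:

```rust
// Hypothetical standalone version of the `to:` -> `ExtraTo:` rewrite used in
// `to_notmuch` above. ExtraTo: is assumed to be configured in notmuch-config
// to index the Delivered-To and X-Original-To headers.
fn rewrite_to_term(term: &str) -> Vec<String> {
    if term.starts_with("to:") {
        vec![
            "(".to_string(),
            term.to_string(),
            "OR".to_string(),
            term.replace("to:", "ExtraTo:"),
            ")".to_string(),
        ]
    } else {
        vec![term.to_string()]
    }
}

fn main() {
    assert_eq!(
        rewrite_to_term("to:bob@example.com").join(" "),
        "( to:bob@example.com OR ExtraTo:bob@example.com )"
    );
    // Non-`to:` terms pass through unchanged.
    assert_eq!(rewrite_to_term("subject:hi"), vec!["subject:hi"]);
}
```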
impl FromStr for Query {
type Err = Infallible;
fn from_str(s: &str) -> Result<Self, Self::Err> {
let mut unread_only = false;
let mut tags = Vec::new();
let mut uids = Vec::new();
let mut remainder = Vec::new();
let mut is_notmuch = false;
let mut is_newsreader = false;
let mut is_tantivy = false;
let mut corpus = None;
for word in s.split_whitespace() {
if word == "is:unread" {
unread_only = true
} else if word.starts_with("tag:") {
let t = &word["tag:".len()..];
// Per-address emails are faked as `tag:@<domain>/<username>`, rewrite to `to:` form
if t.starts_with('@') && t.contains('.') {
let t = match t.split_once('/') {
None => format!("to:{t}"),
Some((domain, user)) => format!("to:{user}{domain}"),
};
remainder.push(t);
} else {
tags.push(t.to_string());
};
/*
} else if word.starts_with("tag:") {
// Any tag that doesn't match site_prefix should explicitly set the site to something not in the
// database
site = Some(NON_EXISTENT_SITE_NAME.to_string());
*/
} else if word.starts_with("corpus:") {
let c = word["corpus:".len()..].to_string();
corpus = c.parse::<Corpus>().map(Some).unwrap_or_else(|e| {
warn!("Error parsing corpus '{c}': {e:?}");
None
});
} else if is_newsreader_thread(word) {
uids.push(word.to_string());
} else if is_notmuch_thread_or_id(word) {
uids.push(word.to_string());
} else if word == "is:mail" || word == "is:email" || word == "is:notmuch" {
is_notmuch = true;
} else if word == "is:news" {
is_newsreader = true;
} else if word == "is:newsreader" {
is_newsreader = true;
} else {
remainder.push(word.to_string());
}
}
// If we don't see any explicit filters for a corpus, flip them all on
if corpus.is_none() && !(is_notmuch || is_tantivy || is_newsreader) {
is_notmuch = true;
is_newsreader = true;
is_tantivy = true;
}
Ok(Query {
unread_only,
tags,
uids,
remainder,
is_notmuch,
is_newsreader,
is_tantivy,
corpus,
})
}
}
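The per-address tag rewrite in `FromStr for Query` can be sketched as a standalone helper (hypothetical name `rewrite_address_tag`): tags of the form `tag:@<domain>/<username>` are turned back into `to:` terms.

```rust
// Standalone sketch of the per-address tag rewrite above. Per-address emails
// are faked as `tag:@<domain>/<username>` and rewritten to `to:` form.
fn rewrite_address_tag(t: &str) -> Option<String> {
    if t.starts_with('@') && t.contains('.') {
        Some(match t.split_once('/') {
            None => format!("to:{t}"),
            Some((domain, user)) => format!("to:{user}{domain}"),
        })
    } else {
        None
    }
}

fn main() {
    assert_eq!(
        rewrite_address_tag("@example.com/alice").as_deref(),
        Some("to:alice@example.com")
    );
    // Ordinary tags are left for the `tags` list.
    assert_eq!(rewrite_address_tag("inbox"), None);
}
```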
pub struct ThreadSummaryRecord {
pub site: Option<String>,
pub date: Option<PrimitiveDateTime>,
pub is_read: Option<bool>,
pub title: Option<String>,
pub uid: String,
pub name: Option<String>,
pub corpus: Corpus,
}
async fn thread_summary_from_row(r: ThreadSummaryRecord) -> ThreadSummary {
let site = r.site.unwrap_or("UNKOWN TAG".to_string());
let mut tags = vec![format!("{NEWSREADER_TAG_PREFIX}{site}")];
if !r.is_read.unwrap_or(true) {
tags.push("unread".to_string());
};
let mut title = r.title.unwrap_or("NO TITLE".to_string());
title = clean_title(&title).await.expect("failed to clean title");
ThreadSummary {
thread: format!("{NEWSREADER_THREAD_PREFIX}{}", r.uid),
timestamp: r
.date
.expect("post missing date")
.assume_utc()
.unix_timestamp() as isize,
date_relative: format!("{:?}", r.date),
//date_relative: "TODO date_relative".to_string(),
matched: 0,
total: 1,
authors: r.name.unwrap_or_else(|| site.clone()),
subject: title,
tags,
corpus: r.corpus,
}
}
async fn clean_title(title: &str) -> Result<String, ServerError> {
// Make title HTML so html parsers work
let mut title = format!("<html>{title}</html>");
let title_transformers: Vec<Box<dyn Transformer>> =
vec![Box::new(EscapeHtml), Box::new(StripHtml)];
for t in title_transformers.iter() {
if t.should_run(&None, &title) {
title = t.transform(&None, &title).await?;
}
}
Ok(title)
}
#[cfg(test)]
mod tests {
use super::{SanitizeHtml, Transformer};
#[tokio::test]
async fn strip_sizes() -> Result<(), Box<dyn std::error::Error>> {
let ss = SanitizeHtml {
cid_prefix: "",
base_url: &None,
};
let input = r#"<p width=16 height=16 style="color:blue;width:16px;height:16px;">This el has width and height attributes and inline styles</p>"#;
let want = r#"<p style="color:blue;">This el has width and height attributes and inline styles</p>"#;
let got = ss.transform(&None, input).await?;
assert_eq!(got, want);
Ok(())
}
}

server/src/mvp.css Normal file

@@ -0,0 +1,498 @@
/* MVP.css v1.15 - https://github.com/andybrewer/mvp */
/* :root content stored in client side index.html */
html {
scroll-behavior: smooth;
}
@media (prefers-reduced-motion: reduce) {
html {
scroll-behavior: auto;
}
}
/* Layout */
article aside {
background: var(--color-secondary-accent);
border-left: 4px solid var(--color-secondary);
padding: 0.01rem 0.8rem;
}
body {
background: var(--color-bg);
color: var(--color-text);
font-family: var(--font-family);
line-height: var(--line-height);
margin: 0;
overflow-x: hidden;
padding: 0;
}
footer,
header,
main {
margin: 0 auto;
max-width: var(--width-content);
padding: 3rem 1rem;
}
hr {
background-color: var(--color-bg-secondary);
border: none;
height: 1px;
margin: 4rem 0;
width: 100%;
}
section {
display: flex;
flex-wrap: wrap;
justify-content: var(--justify-important);
}
section img,
article img {
max-width: 100%;
}
section pre {
overflow: auto;
}
section aside {
border: 1px solid var(--color-bg-secondary);
border-radius: var(--border-radius);
box-shadow: var(--box-shadow) var(--color-shadow);
margin: 1rem;
padding: 1.25rem;
width: var(--width-card);
}
section aside:hover {
box-shadow: var(--box-shadow) var(--color-bg-secondary);
}
[hidden] {
display: none;
}
/* Headers */
article header,
div header,
main header {
padding-top: 0;
}
header {
text-align: var(--justify-important);
}
header a b,
header a em,
header a i,
header a strong {
margin-left: 0.5rem;
margin-right: 0.5rem;
}
header nav img {
margin: 1rem 0;
}
section header {
padding-top: 0;
width: 100%;
}
/* Nav */
nav {
align-items: center;
display: flex;
font-weight: bold;
justify-content: space-between;
margin-bottom: 7rem;
}
nav ul {
list-style: none;
padding: 0;
}
nav ul li {
display: inline-block;
margin: 0 0.5rem;
position: relative;
text-align: left;
}
/* Nav Dropdown */
nav ul li:hover ul {
display: block;
}
nav ul li ul {
background: var(--color-bg);
border: 1px solid var(--color-bg-secondary);
border-radius: var(--border-radius);
box-shadow: var(--box-shadow) var(--color-shadow);
display: none;
height: auto;
left: -2px;
padding: .5rem 1rem;
position: absolute;
top: 1.7rem;
white-space: nowrap;
width: auto;
z-index: 1;
}
nav ul li ul::before {
/* fill gap above to make mousing over them easier */
content: "";
position: absolute;
left: 0;
right: 0;
top: -0.5rem;
height: 0.5rem;
}
nav ul li ul li,
nav ul li ul li a {
display: block;
}
/* Typography */
code,
samp {
background-color: var(--color-accent);
border-radius: var(--border-radius);
color: var(--color-text);
display: inline-block;
margin: 0 0.1rem;
padding: 0 0.5rem;
}
details {
margin: 1.3rem 0;
}
details summary {
font-weight: bold;
cursor: pointer;
}
h1,
h2,
h3,
h4,
h5,
h6 {
line-height: var(--line-height);
text-wrap: balance;
}
mark {
padding: 0.1rem;
}
ol li,
ul li {
padding: 0.2rem 0;
}
p {
margin: 0.75rem 0;
padding: 0;
width: 100%;
}
pre {
margin: 1rem 0;
max-width: var(--width-card-wide);
padding: 1rem 0;
}
pre code,
pre samp {
display: block;
max-width: var(--width-card-wide);
padding: 0.5rem 2rem;
white-space: pre-wrap;
}
small {
color: var(--color-text-secondary);
}
sup {
background-color: var(--color-secondary);
border-radius: var(--border-radius);
color: var(--color-bg);
font-size: xx-small;
font-weight: bold;
margin: 0.2rem;
padding: 0.2rem 0.3rem;
position: relative;
top: -2px;
}
/* Links */
a {
color: var(--color-link);
display: inline-block;
font-weight: bold;
text-decoration: underline;
}
a:hover {
filter: brightness(var(--hover-brightness));
}
a:active {
filter: brightness(var(--active-brightness));
}
a b,
a em,
a i,
a strong,
button,
input[type="submit"] {
border-radius: var(--border-radius);
display: inline-block;
font-size: medium;
font-weight: bold;
line-height: var(--line-height);
margin: 0.5rem 0;
padding: 1rem 2rem;
}
button,
input[type="submit"] {
font-family: var(--font-family);
}
button:hover,
input[type="submit"]:hover {
cursor: pointer;
filter: brightness(var(--hover-brightness));
}
button:active,
input[type="submit"]:active {
filter: brightness(var(--active-brightness));
}
a b,
a strong,
button,
input[type="submit"] {
background-color: var(--color-link);
border: 2px solid var(--color-link);
color: var(--color-bg);
}
a em,
a i {
border: 2px solid var(--color-link);
border-radius: var(--border-radius);
color: var(--color-link);
display: inline-block;
padding: 1rem 2rem;
}
article aside a {
color: var(--color-secondary);
}
/* Images */
figure {
margin: 0;
padding: 0;
}
figure img {
max-width: 100%;
}
figure figcaption {
color: var(--color-text-secondary);
}
/* Forms */
button:disabled,
input:disabled {
background: var(--color-bg-secondary);
border-color: var(--color-bg-secondary);
color: var(--color-text-secondary);
cursor: not-allowed;
}
button[disabled]:hover,
input[type="submit"][disabled]:hover {
filter: none;
}
form {
border: 1px solid var(--color-bg-secondary);
border-radius: var(--border-radius);
box-shadow: var(--box-shadow) var(--color-shadow);
display: block;
max-width: var(--width-card-wide);
min-width: var(--width-card);
padding: 1.5rem;
text-align: var(--justify-normal);
}
form header {
margin: 1.5rem 0;
padding: 1.5rem 0;
}
input,
label,
select,
textarea {
display: block;
font-size: inherit;
max-width: var(--width-card-wide);
}
input[type="checkbox"],
input[type="radio"] {
display: inline-block;
}
input[type="checkbox"]+label,
input[type="radio"]+label {
display: inline-block;
font-weight: normal;
position: relative;
top: 1px;
}
input[type="range"] {
padding: 0.4rem 0;
}
input,
select,
textarea {
border: 1px solid var(--color-bg-secondary);
border-radius: var(--border-radius);
margin-bottom: 1rem;
padding: 0.4rem 0.8rem;
}
input[type="text"],
input[type="password"] textarea {
width: calc(100% - 1.6rem);
}
input[readonly],
textarea[readonly] {
background-color: var(--color-bg-secondary);
}
label {
font-weight: bold;
margin-bottom: 0.2rem;
}
/* Popups */
dialog {
border: 1px solid var(--color-bg-secondary);
border-radius: var(--border-radius);
box-shadow: var(--box-shadow) var(--color-shadow);
position: fixed;
top: 50%;
left: 50%;
transform: translate(-50%, -50%);
width: 50%;
z-index: 999;
}
/* Tables */
table {
border: 1px solid var(--color-bg-secondary);
border-radius: var(--border-radius);
border-spacing: 0;
display: inline-block;
max-width: 100%;
overflow-x: auto;
padding: 0;
white-space: nowrap;
}
table td,
table th,
table tr {
padding: 0.4rem 0.8rem;
text-align: var(--justify-important);
}
table thead {
background-color: var(--color-table);
border-collapse: collapse;
border-radius: var(--border-radius);
color: var(--color-bg);
margin: 0;
padding: 0;
}
table thead tr:first-child th:first-child {
border-top-left-radius: var(--border-radius);
}
table thead tr:first-child th:last-child {
border-top-right-radius: var(--border-radius);
}
table thead th:first-child,
table tr td:first-child {
text-align: var(--justify-normal);
}
table tr:nth-child(even) {
background-color: var(--color-accent);
}
/* Quotes */
blockquote {
display: block;
font-size: x-large;
line-height: var(--line-height);
margin: 1rem auto;
max-width: var(--width-card-medium);
padding: 1.5rem 1rem;
text-align: var(--justify-important);
}
blockquote footer {
color: var(--color-text-secondary);
display: block;
font-size: small;
line-height: var(--line-height);
padding: 1.5rem 0;
}
/* Scrollbars */
* {
scrollbar-width: thin;
scrollbar-color: var(--color-scrollbar) transparent;
}
*::-webkit-scrollbar {
width: 5px;
height: 5px;
}
*::-webkit-scrollbar-track {
background: transparent;
}
*::-webkit-scrollbar-thumb {
background-color: var(--color-scrollbar);
border-radius: 10px;
}

server/src/newsreader.rs Normal file

@@ -0,0 +1,386 @@
use std::collections::HashMap;
use cacher::FilesystemCacher;
use futures::{stream::FuturesUnordered, StreamExt};
use letterbox_shared::compute_color;
use maplit::hashmap;
use scraper::Selector;
use sqlx::postgres::PgPool;
use tracing::{error, info, instrument};
use url::Url;
use crate::{
clean_title, compute_offset_limit,
error::ServerError,
graphql::{Corpus, NewsPost, Tag, Thread, ThreadSummary},
thread_summary_from_row, AddOutlink, FrameImages, Query, SanitizeHtml, SlurpContents,
StripHtml, ThreadSummaryRecord, Transformer, NEWSREADER_TAG_PREFIX, NEWSREADER_THREAD_PREFIX,
};
pub fn is_newsreader_query(query: &Query) -> bool {
query.is_newsreader || query.corpus == Some(Corpus::Newsreader)
}
pub fn is_newsreader_thread(query: &str) -> bool {
query.starts_with(NEWSREADER_THREAD_PREFIX)
}
pub fn extract_thread_id(query: &str) -> &str {
if query.starts_with(NEWSREADER_THREAD_PREFIX) {
&query[NEWSREADER_THREAD_PREFIX.len()..]
} else {
query
}
}
pub fn extract_site(tag: &str) -> &str {
&tag[NEWSREADER_TAG_PREFIX.len()..]
}
pub fn make_news_tag(tag: &str) -> String {
format!("tag:{NEWSREADER_TAG_PREFIX}{tag}")
}
fn site_from_tags(tags: &[String]) -> Option<String> {
for t in tags {
if t.starts_with(NEWSREADER_TAG_PREFIX) {
return Some(extract_site(t).to_string());
}
}
None
}
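The prefix helpers above can be exercised standalone; `"news/"` below is a hypothetical stand-in for `NEWSREADER_TAG_PREFIX`, which is defined elsewhere in the crate.

```rust
// "news/" is a hypothetical value for illustration only; the real
// NEWSREADER_TAG_PREFIX constant lives elsewhere in the crate.
const NEWSREADER_TAG_PREFIX: &str = "news/";

// Strip the newsreader prefix to recover the site name.
fn extract_site(tag: &str) -> &str {
    &tag[NEWSREADER_TAG_PREFIX.len()..]
}

// Return the site named by the first newsreader-prefixed tag, if any.
fn site_from_tags(tags: &[String]) -> Option<String> {
    tags.iter()
        .find(|t| t.starts_with(NEWSREADER_TAG_PREFIX))
        .map(|t| extract_site(t).to_string())
}

fn main() {
    let tags = vec!["unread".to_string(), "news/jvns.ca".to_string()];
    assert_eq!(site_from_tags(&tags), Some("jvns.ca".to_string()));
    assert_eq!(site_from_tags(&["unread".to_string()]), None);
}
```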
#[instrument(name = "newsreader::count", skip_all, fields(query=%query))]
pub async fn count(pool: &PgPool, query: &Query) -> Result<usize, ServerError> {
if !is_newsreader_query(query) {
return Ok(0);
}
let site = site_from_tags(&query.tags);
if !query.tags.is_empty() && site.is_none() {
// Newsreader only handles queries with a site tag or no tags at all; a query
// with a non-site tag isn't supported
return Ok(0);
}
let search_term = query.remainder.join(" ");
let search_term = search_term.trim();
let search_term = if search_term.is_empty() {
None
} else {
Some(search_term)
};
// TODO: add support for looking for search_term in title and site
let row = sqlx::query_file!("sql/count.sql", site, query.unread_only, search_term)
.fetch_one(pool)
.await?;
Ok(row.count.unwrap_or(0).try_into().unwrap_or(0))
}
#[instrument(name = "newsreader::search", skip_all, fields(query=%query))]
pub async fn search(
pool: &PgPool,
after: Option<i32>,
before: Option<i32>,
first: Option<i32>,
last: Option<i32>,
query: &Query,
) -> Result<Vec<(i32, ThreadSummary)>, async_graphql::Error> {
info!("search({after:?} {before:?} {first:?} {last:?} {query:?}");
if !is_newsreader_query(query) {
return Ok(Vec::new());
}
let site = site_from_tags(&query.tags);
if !query.tags.is_empty() && site.is_none() {
// Newsreader only handles queries with a site tag or no tags at all; a query
// with a non-site tag isn't supported
return Ok(Vec::new());
}
let (offset, mut limit) = compute_offset_limit(after, before, first, last);
if before.is_none() {
// When searching forward, the +1 is to see if there are more pages of data available.
// Searching backwards implies there's more pages forward, because the value represented by
// `before` is on the next page.
limit = limit + 1;
}
info!(
"search offset {offset} limit {limit} site {site:?} unread_only {}",
query.unread_only
);
let search_term = query.remainder.join(" ");
let search_term = search_term.trim();
let search_term = if search_term.is_empty() {
None
} else {
Some(search_term)
};
// TODO: add support for looking for search_term in title and site
let rows = sqlx::query_file!(
"sql/threads.sql",
site,
query.unread_only,
offset as i64,
limit as i64,
search_term
)
.fetch_all(pool)
.await?;
let mut res = Vec::new();
for (i, r) in rows.into_iter().enumerate() {
res.push((
i as i32 + offset,
thread_summary_from_row(ThreadSummaryRecord {
site: r.site,
date: r.date,
is_read: r.is_read,
title: r.title,
uid: r.uid,
name: r.name,
corpus: Corpus::Newsreader,
})
.await,
));
}
Ok(res)
}
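The `limit + 1` look-ahead in `search` amounts to the following pattern: fetch one extra row, then use its presence only as a has-next-page signal. A minimal sketch (names are illustrative):

```rust
// Look-ahead paging sketch: the store is asked for limit+1 rows; the extra
// row, if present, signals that another page exists and is then dropped.
fn page_with_lookahead(mut rows: Vec<i32>, limit: usize) -> (Vec<i32>, bool) {
    let has_next_page = rows.len() > limit;
    rows.truncate(limit);
    (rows, has_next_page)
}

fn main() {
    // Three rows came back for a limit of two: a next page exists.
    assert_eq!(page_with_lookahead(vec![1, 2, 3], 2), (vec![1, 2], true));
    // Exactly `limit` rows: this is the last page.
    assert_eq!(page_with_lookahead(vec![1, 2], 2), (vec![1, 2], false));
}
```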
#[instrument(name = "newsreader::tags", skip_all, fields(needs_unread=%_needs_unread))]
pub async fn tags(pool: &PgPool, _needs_unread: bool) -> Result<Vec<Tag>, ServerError> {
// TODO: optimize query by using needs_unread
let tags = sqlx::query_file!("sql/tags.sql").fetch_all(pool).await?;
let tags = tags
.into_iter()
.map(|tag| {
let unread = tag.unread.unwrap_or(0).try_into().unwrap_or(0);
let name = format!(
"{NEWSREADER_TAG_PREFIX}{}",
tag.site.expect("tag must have site")
);
let hex = compute_color(&name);
Tag {
name,
fg_color: "white".to_string(),
bg_color: hex,
unread,
}
})
.collect();
Ok(tags)
}
#[instrument(name = "newsreader::thread", skip_all, fields(thread_id=%thread_id))]
pub async fn thread(
cacher: &FilesystemCacher,
pool: &PgPool,
thread_id: String,
) -> Result<Thread, ServerError> {
let id = thread_id
.strip_prefix(NEWSREADER_THREAD_PREFIX)
.unwrap_or_else(|| panic!("news thread doesn't start with '{NEWSREADER_THREAD_PREFIX}'"))
.to_string();
let r = sqlx::query_file!("sql/thread.sql", id)
.fetch_one(pool)
.await?;
let slug = r.site.unwrap_or("no-slug".to_string());
let site = r.name.unwrap_or("NO SITE".to_string());
// TODO: remove the various places that have this as an Option
let link = Some(Url::parse(&r.link)?);
let mut body = r.clean_summary.unwrap_or("NO SUMMARY".to_string());
let body_transformers: Vec<Box<dyn Transformer>> = vec![
Box::new(SlurpContents {
cacher,
inline_css: true,
site_selectors: slurp_contents_selectors(),
}),
Box::new(FrameImages),
Box::new(AddOutlink),
// TODO: causes doubling of images in cloudflare blogs
//Box::new(EscapeHtml),
Box::new(SanitizeHtml {
cid_prefix: "",
base_url: &link,
}),
];
for t in body_transformers.iter() {
if t.should_run(&link, &body) {
body = t.transform(&link, &body).await?;
}
}
let title = clean_title(&r.title.unwrap_or("NO TITLE".to_string())).await?;
let is_read = r.is_read.unwrap_or(false);
let timestamp = r
.date
.expect("post missing date")
.assume_utc()
.unix_timestamp();
Ok(Thread::News(NewsPost {
thread_id,
is_read,
slug,
site,
title,
body,
url: link
.as_ref()
.map(|url| url.to_string())
.unwrap_or("NO URL".to_string()),
timestamp,
}))
}
#[instrument(name = "newsreader::set_read_status", skip_all, fields(query=%query,unread=%unread))]
pub async fn set_read_status<'ctx>(
pool: &PgPool,
query: &Query,
unread: bool,
) -> Result<bool, ServerError> {
// TODO: make single query when query.uids.len() > 1
let uids: Vec<_> = query
.uids
.iter()
.filter(|uid| is_newsreader_thread(uid))
.map(
|uid| extract_thread_id(uid), // TODO strip prefix
)
.collect();
for uid in uids {
sqlx::query_file!("sql/set_unread.sql", !unread, uid)
.execute(pool)
.await?;
}
Ok(true)
}
#[instrument(name = "newsreader::refresh", skip_all)]
pub async fn refresh<'ctx>(pool: &PgPool, cacher: &FilesystemCacher) -> Result<bool, ServerError> {
async fn update_search_summary(
pool: &PgPool,
cacher: &FilesystemCacher,
link: String,
body: String,
id: i32,
) -> Result<(), ServerError> {
let slurp_contents = SlurpContents {
cacher,
inline_css: true,
site_selectors: slurp_contents_selectors(),
};
let strip_html = StripHtml;
info!("adding {link} to search index");
let mut body = body;
if let Ok(link) = Url::parse(&link) {
let link = Some(link);
if slurp_contents.should_run(&link, &body) {
body = slurp_contents.transform(&link, &body).await?;
}
} else {
error!("failed to parse link: {}", link);
}
body = strip_html.transform(&None, &body).await?;
sqlx::query!(
"UPDATE post SET search_summary = $1 WHERE id = $2",
body,
id
)
.execute(pool)
.await?;
Ok(())
}
let mut unordered: FuturesUnordered<_> = sqlx::query_file!("sql/need-search-summary.sql",)
.fetch_all(pool)
.await?
.into_iter()
.filter_map(|r| {
let Some(body) = r.clean_summary else {
error!("clean_summary missing for {}", r.link);
return None;
};
let id = r.id;
Some(update_search_summary(pool, cacher, r.link, body, id))
})
.collect();
while let Some(res) = unordered.next().await {
match res {
Ok(()) => {}
Err(err) => {
info!("failed refresh {err:?}");
// TODO:
//fd.error = Some(err);
}
};
}
Ok(true)
}
fn slurp_contents_selectors() -> HashMap<String, Vec<Selector>> {
hashmap![
"atmeta.com".to_string() => vec![
Selector::parse("div.entry-content").unwrap(),
],
"blog.prusa3d.com".to_string() => vec![
Selector::parse("article.content .post-block").unwrap(),
],
"blog.cloudflare.com".to_string() => vec![
Selector::parse(".author-lists .author-name-tooltip").unwrap(),
Selector::parse(".post-full-content").unwrap()
],
"blog.zsa.io".to_string() => vec![
Selector::parse("section.blog-article").unwrap(),
],
"engineering.fb.com".to_string() => vec![
Selector::parse("article").unwrap(),
],
"grafana.com".to_string() => vec![
Selector::parse(".blog-content").unwrap(),
],
"hackaday.com".to_string() => vec![
Selector::parse("div.entry-featured-image").unwrap(),
Selector::parse("div.entry-content").unwrap()
],
"ingowald.blog".to_string() => vec![
Selector::parse("article").unwrap(),
],
"jvns.ca".to_string() => vec![
Selector::parse("article").unwrap(),
],
"mitchellh.com".to_string() => vec![Selector::parse("div.w-full").unwrap()],
"natwelch.com".to_string() => vec![
Selector::parse("article div.prose").unwrap(),
],
"seiya.me".to_string() => vec![
Selector::parse("header + div").unwrap(),
],
"rustacean-station.org".to_string() => vec![
Selector::parse("article").unwrap(),
],
"slashdot.org".to_string() => vec![
Selector::parse("span.story-byline").unwrap(),
Selector::parse("div.p").unwrap(),
],
"theonion.com".to_string() => vec![
// Single image joke w/ title
Selector::parse("article > section > div > figure").unwrap(),
// Single cartoon
Selector::parse("article > div > div > figure").unwrap(),
// Image at top of article
Selector::parse("article > header > div > div > figure").unwrap(),
// Article body
Selector::parse("article .entry-content > *").unwrap(),
],
"trofi.github.io".to_string() => vec![
Selector::parse("#content").unwrap(),
],
"www.redox-os.org".to_string() => vec![
Selector::parse("div.content").unwrap(),
],
"www.smbc-comics.com".to_string() => vec![
Selector::parse("img#cc-comic").unwrap(),
Selector::parse("div#aftercomic img").unwrap(),
],
]
}


@@ -1,13 +1,817 @@
use shared::Message;
use std::{
collections::{HashMap, HashSet},
fs::File,
io::{Cursor, Read},
};
use crate::error;
use letterbox_notmuch::Notmuch;
use letterbox_shared::{compute_color, Rule};
use mailparse::{parse_mail, MailHeader, MailHeaderMap};
use memmap::MmapOptions;
use sqlx::{types::Json, PgPool};
use tracing::{error, info, info_span, instrument, warn};
use zip::ZipArchive;
use crate::{
compute_offset_limit,
email_extract::*,
error::ServerError,
graphql::{
Attachment, Body, Corpus, EmailThread, Header, Html, Message, PlainText, Tag, Thread,
ThreadSummary, UnhandledContentType,
},
linkify_html, InlineStyle, Query, SanitizeHtml, Transformer,
};
const APPLICATION_GZIP: &str = "application/gzip";
const APPLICATION_ZIP: &str = "application/zip";
const MULTIPART_REPORT: &str = "multipart/report";
const MAX_RAW_MESSAGE_SIZE: usize = 100_000;
fn is_notmuch_query(query: &Query) -> bool {
query.is_notmuch || query.corpus == Some(Corpus::Notmuch)
}
pub fn is_notmuch_thread_or_id(id: &str) -> bool {
id.starts_with("id:") || id.starts_with("thread:")
}
// TODO(wathiede): decide good error type
pub fn threadset_to_messages(
thread_set: letterbox_notmuch::ThreadSet,
) -> Result<Vec<Message>, ServerError> {
for t in thread_set.0 {
for _tn in t.0 {}
}
Ok(Vec::new())
}
#[instrument(name="nm::count", skip_all, fields(query=%query))]
pub async fn count(nm: &Notmuch, query: &Query) -> Result<usize, ServerError> {
if !is_notmuch_query(query) {
return Ok(0);
}
let query = query.to_notmuch();
Ok(nm.count(&query)?)
}
#[instrument(name="nm::search", skip_all, fields(query=%query))]
pub async fn search(
nm: &Notmuch,
after: Option<i32>,
before: Option<i32>,
first: Option<i32>,
last: Option<i32>,
query: &Query,
) -> Result<Vec<(i32, ThreadSummary)>, async_graphql::Error> {
if !is_notmuch_query(query) {
return Ok(Vec::new());
}
let query = query.to_notmuch();
let (offset, mut limit) = compute_offset_limit(after, before, first, last);
if before.is_none() {
// When searching forward, the +1 is to see if there are more pages of data available.
// Searching backwards implies there's more pages forward, because the value represented by
// `before` is on the next page.
limit = limit + 1;
}
Ok(nm
.search(&query, offset as usize, limit as usize)?
.0
.into_iter()
.enumerate()
.map(|(i, ts)| {
(
offset + i as i32,
ThreadSummary {
thread: format!("thread:{}", ts.thread),
timestamp: ts.timestamp,
date_relative: ts.date_relative,
matched: ts.matched,
total: ts.total,
authors: ts.authors,
subject: ts.subject,
tags: ts.tags,
corpus: Corpus::Notmuch,
},
)
})
.collect())
}
#[instrument(name="nm::tags", skip_all, fields(needs_unread=needs_unread))]
pub fn tags(nm: &Notmuch, needs_unread: bool) -> Result<Vec<Tag>, ServerError> {
let unread_msg_cnt: HashMap<String, usize> = if needs_unread {
// 10000 is an arbitrary cap; if there are more than 10k unread messages, we'll
// get an inaccurate count.
nm.search("is:unread", 0, 10000)?
.0
.iter()
.fold(HashMap::new(), |mut m, ts| {
ts.tags.iter().for_each(|t| {
m.entry(t.clone()).and_modify(|c| *c += 1).or_insert(1);
});
m
})
} else {
HashMap::new()
};
let tags: Vec<_> = nm
.tags()?
.into_iter()
.map(|tag| {
let hex = compute_color(&tag);
let unread = if needs_unread {
*unread_msg_cnt.get(&tag).unwrap_or(&0)
} else {
0
};
Tag {
name: tag,
fg_color: "white".to_string(),
bg_color: hex,
unread,
}
})
.chain(
nm.unread_recipients()?
.into_iter()
.filter_map(|(name, unread)| {
let Some(idx) = name.find('@') else {
return None;
};
let name = format!("{}/{}", &name[idx..], &name[..idx]);
let bg_color = compute_color(&name);
Some(Tag {
name,
fg_color: "white".to_string(),
bg_color,
unread,
})
}),
)
.collect();
Ok(tags)
}
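The per-tag unread counting fold in `tags` can be sketched standalone: each thread summary carries a tag list, and the fold counts how many summaries carry each tag.

```rust
use std::collections::HashMap;

// Standalone sketch of the unread-per-tag fold above; each inner Vec stands
// in for one thread summary's tag list.
fn count_unread_per_tag(summaries: &[Vec<&str>]) -> HashMap<String, usize> {
    summaries.iter().fold(HashMap::new(), |mut m, tags| {
        for t in tags {
            m.entry(t.to_string()).and_modify(|c| *c += 1).or_insert(1);
        }
        m
    })
}

fn main() {
    let summaries = vec![vec!["inbox", "unread"], vec!["inbox"]];
    let counts = count_unread_per_tag(&summaries);
    assert_eq!(counts.get("inbox"), Some(&2));
    assert_eq!(counts.get("unread"), Some(&1));
    assert_eq!(counts.get("spam"), None);
}
```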
#[instrument(name="nm::thread", skip_all, fields(thread_id=thread_id))]
pub async fn thread(
nm: &Notmuch,
pool: &PgPool,
thread_id: String,
debug_content_tree: bool,
) -> Result<Thread, ServerError> {
// TODO(wathiede): normalize all email addresses through an address book with preferred
// display names (that default to the most commonly seen name).
let mut messages = Vec::new();
for (path, id) in std::iter::zip(nm.files(&thread_id)?, nm.message_ids(&thread_id)?) {
let mut html_report_summary: Option<String> = None;
let tags = nm.tags_for_query(&format!("id:{}", id))?;
let file = File::open(&path)?;
let mmap = unsafe { MmapOptions::new().map(&file)? };
let m = parse_mail(&mmap)?;
let from = email_addresses(&path, &m, "from")?;
let mut from = match from.len() {
0 => None,
1 => from.into_iter().next(),
_ => {
warn!(
"Got {} from addresses in message, truncating: {:?}",
from.len(),
from
);
from.into_iter().next()
}
};
match from.as_mut() {
Some(from) => {
if let Some(addr) = from.addr.as_mut() {
let photo_url = photo_url_for_email_address(&pool, &addr).await?;
from.photo_url = photo_url;
}
}
_ => (),
}
let to = email_addresses(&path, &m, "to")?;
let cc = email_addresses(&path, &m, "cc")?;
let delivered_to = email_addresses(&path, &m, "delivered-to")?.pop();
let x_original_to = email_addresses(&path, &m, "x-original-to")?.pop();
let subject = m.headers.get_first_value("subject");
let timestamp = m
.headers
.get_first_value("date")
.and_then(|d| mailparse::dateparse(&d).ok());
let cid_prefix = letterbox_shared::urls::cid_prefix(None, &id);
let base_url = None;
let mut part_addr = Vec::new();
part_addr.push(id.to_string());
let body = match extract_body(&m, &mut part_addr)? {
Body::PlainText(PlainText { text, content_tree }) => {
let text = if text.len() > MAX_RAW_MESSAGE_SIZE {
format!(
"{}...\n\nMESSAGE WAS TRUNCATED @ {} bytes",
&text[..MAX_RAW_MESSAGE_SIZE],
MAX_RAW_MESSAGE_SIZE
)
} else {
text
};
Body::Html(Html {
html: {
let body_transformers: Vec<Box<dyn Transformer>> = vec![
Box::new(InlineStyle),
Box::new(SanitizeHtml {
cid_prefix: &cid_prefix,
base_url: &base_url,
}),
];
let mut html = linkify_html(&text.trim_matches('\n'));
for t in body_transformers.iter() {
if t.should_run(&None, &html) {
html = t.transform(&None, &html).await?;
}
}
format!(
r#"<p class="view-part-text-plain font-mono whitespace-pre-line">{}</p>"#,
// Trim newlines to prevent excessive white space at the beginning/end of
// presentation. Leave tabs and spaces in case plain text attempts to center a
// header on the first line.
html
)
},
content_tree: if debug_content_tree {
render_content_type_tree(&m)
} else {
content_tree
},
})
}
Body::Html(Html {
mut html,
content_tree,
}) => Body::Html(Html {
html: {
let body_transformers: Vec<Box<dyn Transformer>> = vec![
// TODO: this breaks things like emails from calendar
//Box::new(InlineStyle),
Box::new(SanitizeHtml {
cid_prefix: &cid_prefix,
base_url: &base_url,
}),
];
for t in body_transformers.iter() {
if t.should_run(&None, &html) {
html = t.transform(&None, &html).await?;
}
}
html
},
content_tree: if debug_content_tree {
render_content_type_tree(&m)
} else {
content_tree
},
}),
Body::UnhandledContentType(UnhandledContentType { content_tree, .. }) => {
let body_start = mmap
.windows(2)
.take(20_000)
.position(|w| w == b"\n\n")
.unwrap_or(0);
let body = mmap[body_start + 2..].to_vec();
Body::UnhandledContentType(UnhandledContentType {
text: String::from_utf8(body)?,
content_tree: if debug_content_tree {
render_content_type_tree(&m)
} else {
content_tree
},
})
}
};
let headers = m
.headers
.iter()
.map(|h| Header {
key: h.get_key(),
value: h.get_value(),
})
.collect();
// TODO(wathiede): parse message and fill out attachments
let attachments = extract_attachments(&m, &id)?;
let mut final_body = body;
let mut raw_report_content: Option<String> = None;
// Append TLS report if available
if m.ctype.mimetype.as_str() == MULTIPART_REPORT {
if let Ok(Body::Html(_html_body)) = extract_report(&m, &mut part_addr) {
// Extract raw JSON for pretty printing
if let Some(sp) = m
.subparts
.iter()
.find(|sp| sp.ctype.mimetype.as_str() == "application/tlsrpt+gzip")
{
if let Ok(gz_bytes) = sp.get_body_raw() {
let mut decoder = flate2::read::GzDecoder::new(&gz_bytes[..]);
let mut buffer = Vec::new();
if decoder.read_to_end(&mut buffer).is_ok() {
if let Ok(json_str) = String::from_utf8(buffer) {
raw_report_content = Some(json_str);
}
}
}
}
}
}
// Append DMARC report if available
if m.ctype.mimetype.as_str() == APPLICATION_ZIP {
if let Ok(Body::Html(html_body)) = extract_zip(&m) {
html_report_summary = Some(html_body.html);
// Extract raw XML for pretty printing
if let Ok(zip_bytes) = m.get_body_raw() {
if let Ok(mut archive) = ZipArchive::new(Cursor::new(&zip_bytes)) {
for i in 0..archive.len() {
if let Ok(mut file) = archive.by_index(i) {
let name = file.name().to_lowercase();
if is_dmarc_report_filename(&name) {
let mut xml = String::new();
use std::io::Read;
if file.read_to_string(&mut xml).is_ok() {
raw_report_content = Some(xml);
}
}
}
}
}
}
}
}
if m.ctype.mimetype.as_str() == APPLICATION_GZIP {
// Call extract_gzip to get the HTML summary and also to determine if it's a DMARC report
if let Ok((Body::Html(html_body), _)) = extract_gzip(&m) {
html_report_summary = Some(html_body.html);
// If extract_gzip successfully parsed a DMARC report, then extract the raw content
if let Ok(gz_bytes) = m.get_body_raw() {
let mut decoder = flate2::read::GzDecoder::new(&gz_bytes[..]);
let mut xml = String::new();
use std::io::Read;
if decoder.read_to_string(&mut xml).is_ok() {
raw_report_content = Some(xml);
}
}
}
}
let mut current_html = final_body.to_html().unwrap_or_default();
if let Some(html_summary) = html_report_summary {
current_html.push_str(&html_summary);
}
if let Some(raw_content) = raw_report_content {
let pretty_printed_content = if m.ctype.mimetype.as_str() == MULTIPART_REPORT {
// Pretty print JSON
if let Ok(parsed_json) = serde_json::from_str::<serde_json::Value>(&raw_content) {
serde_json::to_string_pretty(&parsed_json).unwrap_or(raw_content)
} else {
raw_content
}
} else {
// DMARC reports are XML
// Pretty print XML
match pretty_print_xml_with_trimming(&raw_content) {
Ok(pretty_xml) => pretty_xml,
Err(e) => {
error!("Failed to pretty print XML: {:?}", e);
raw_content
}
}
};
current_html.push_str(&format!(
"\n<pre>{}</pre>",
html_escape::encode_text(&pretty_printed_content)
));
}
final_body = Body::Html(Html {
html: current_html,
content_tree: final_body.to_html_content_tree().unwrap_or_default(),
});
messages.push(Message {
id: format!("id:{}", id),
from,
to,
cc,
subject,
tags,
timestamp,
headers,
body: final_body,
path,
attachments,
delivered_to,
x_original_to,
});
}
messages.reverse();
// Find the first subject that's set. After reversing the vec, this should be the oldest
// message.
let subject: String = messages
.iter()
.find(|m| m.subject.is_some())
.and_then(|m| m.subject.clone())
.unwrap_or_else(|| "(NO SUBJECT)".to_string());
Ok(Thread::Email(EmailThread {
thread_id,
subject,
messages,
}))
}
pub fn cid_attachment_bytes(nm: &Notmuch, id: &str, cid: &str) -> Result<Attachment, ServerError> {
let files = nm.files(id)?;
let Some(path) = files.first() else {
warn!("failed to find files for message {}", id);
return Err(ServerError::PartNotFound);
};
let file = File::open(&path)?;
let mmap = unsafe { MmapOptions::new().map(&file)? };
let m = parse_mail(&mmap)?;
if let Some(attachment) = walk_attachments(&m, |sp, _cur_idx| {
info!("{} {:?}", cid, get_content_id(&sp.headers));
if let Some(h_cid) = get_content_id(&sp.headers) {
let h_cid = &h_cid[1..h_cid.len() - 1];
if h_cid == cid {
let attachment = extract_attachment(&sp, id, &[]).unwrap_or_default();
return Some(attachment);
}
}
None
}) {
return Ok(attachment);
}
Err(ServerError::PartNotFound)
}
pub fn attachment_bytes(nm: &Notmuch, id: &str, idx: &[usize]) -> Result<Attachment, ServerError> {
let files = nm.files(id)?;
let Some(path) = files.first() else {
warn!("failed to find files for message {}", id);
return Err(ServerError::PartNotFound);
};
let file = File::open(&path)?;
let mmap = unsafe { MmapOptions::new().map(&file)? };
let m = parse_mail(&mmap)?;
if idx.is_empty() {
let Some(attachment) = extract_attachment(&m, id, &[]) else {
return Err(ServerError::PartNotFound);
};
return Ok(attachment);
}
if let Some(attachment) = walk_attachments(&m, |sp, cur_idx| {
if cur_idx == idx {
let attachment = extract_attachment(&sp, id, idx).unwrap_or_default();
return Some(attachment);
}
None
}) {
return Ok(attachment);
}
Err(ServerError::PartNotFound)
}
#[instrument(name="nm::set_read_status", skip_all, fields(query=%query, unread=unread))]
pub async fn set_read_status<'ctx>(
nm: &Notmuch,
query: &Query,
unread: bool,
) -> Result<bool, ServerError> {
let uids: Vec<_> = query
.uids
.iter()
.filter(|uid| is_notmuch_thread_or_id(uid))
.collect();
info!("set_read_status({} {:?})", unread, uids);
for uid in uids {
if unread {
nm.tag_add("unread", uid)?;
} else {
nm.tag_remove("unread", uid)?;
}
}
Ok(true)
}
async fn photo_url_for_email_address(
pool: &PgPool,
addr: &str,
) -> Result<Option<String>, ServerError> {
let row =
sqlx::query_as::<_, (String,)>(include_str!("../sql/photo_url_for_email_address.sql"))
.bind(addr)
.fetch_optional(pool)
.await?;
Ok(row.map(|r| r.0))
}
/*
* grab email_rules table from sql
* For each message with `unprocessed` label
* parse the message
* pass headers for each message through a matcher using email rules
* for each match, add label to message
* if any matches were found, remove unprocessed
* TODO: how to handle inbox label
*/
#[instrument(name="nm::label_unprocessed", skip_all, fields(dryrun=dryrun, limit=?limit, query=%query))]
pub async fn label_unprocessed(
nm: &Notmuch,
pool: &PgPool,
dryrun: bool,
limit: Option<usize>,
query: &str,
) -> Result<Box<[String]>, ServerError> {
use futures::StreamExt;
let ids = nm.message_ids(query)?;
info!(
"Processing {:?} of {} messages with '{}'",
limit,
ids.len(),
query
);
let rules: Vec<_> =
sqlx::query_as::<_, (Json<Rule>,)>(include_str!("../sql/label_unprocessed.sql"))
.fetch(pool)
.map(|r| r.unwrap().0 .0)
.collect()
.await;
/*
use letterbox_shared::{Match, MatchType};
let rules = vec![Rule {
stop_on_match: false,
matches: vec![Match {
match_type: MatchType::From,
needle: "eftours".to_string(),
}],
tag: "EFTours".to_string(),
}];
*/
info!("Loaded {} rules", rules.len());
let limit = limit.unwrap_or(ids.len());
let limit = limit.min(ids.len());
let ids = &ids[..limit];
let mut add_mutations = HashMap::new();
let mut rm_mutations = HashMap::new();
for id in ids {
let id = format!("id:{}", id);
let files = nm.files(&id)?;
// Only process the first file path if multiple files have the same id
let Some(path) = files.first() else {
error!("No files for message-ID {}", id);
let t = "Letterbox/Bad";
nm.tag_add(t, &id)?;
let t = "unprocessed";
nm.tag_remove(t, &id)?;
continue;
};
let file = File::open(&path)?;
info!("parsing {}", path);
let mmap = unsafe { MmapOptions::new().map(&file)? };
let m = match info_span!("parse_mail", path = path).in_scope(|| parse_mail(&mmap)) {
Ok(m) => m,
Err(err) => {
error!("Failed to parse {}: {}", path, err);
let t = "Letterbox/Bad";
nm.tag_add(t, &id)?;
let t = "unprocessed";
nm.tag_remove(t, &id)?;
continue;
}
};
let (matched_rule, add_tags) = find_tags(&rules, &m.headers);
if matched_rule {
if dryrun {
info!(
"\nAdd tags: {:?}\nTo: {} From: {} Subject: {}\n",
add_tags,
m.headers.get_first_value("to").expect("no to header"),
m.headers.get_first_value("from").expect("no from header"),
m.headers
.get_first_value("subject")
.expect("no subject header")
);
}
for t in &add_tags {
//nm.tag_add(t, &id)?;
add_mutations
.entry(t.to_string())
.or_insert_with(|| Vec::new())
.push(id.clone());
}
if add_tags.contains("spam") || add_tags.contains("Spam") {
//nm.tag_remove("unread", &id)?;
let t = "unread".to_string();
rm_mutations
.entry(t)
.or_insert_with(|| Vec::new())
.push(id.clone());
}
if !add_tags.contains("inbox") {
//nm.tag_remove("inbox", &id)?;
let t = "inbox".to_string();
rm_mutations
.entry(t)
.or_insert_with(|| Vec::new())
.push(id.clone());
}
//nm.tag_remove("unprocessed", &id)?;
} else {
if add_tags.is_empty() {
let t = "Grey".to_string();
add_mutations
.entry(t)
.or_insert_with(|| Vec::new())
.push(id.clone());
}
//nm.tag_remove("inbox", &id)?;
let t = "inbox".to_string();
rm_mutations
.entry(t)
.or_insert_with(|| Vec::new())
.push(id.clone());
}
let t = "unprocessed".to_string();
rm_mutations
.entry(t)
.or_insert_with(|| Vec::new())
.push(id.clone());
}
info!("Adding {} distinct labels", add_mutations.len());
for (tag, ids) in add_mutations.iter() {
info!(" {}: {}", tag, ids.len());
if !dryrun {
let ids: Vec<_> = ids.iter().map(|s| s.as_str()).collect();
info_span!("tags_add", tag = tag, count = ids.len())
.in_scope(|| nm.tags_add(tag, &ids))?;
}
}
info!("Removing {} distinct labels", rm_mutations.len());
for (tag, ids) in rm_mutations.iter() {
info!(" {}: {}", tag, ids.len());
if !dryrun {
let ids: Vec<_> = ids.iter().map(|s| s.as_str()).collect();
info_span!("tags_remove", tag = tag, count = ids.len())
.in_scope(|| nm.tags_remove(tag, &ids))?;
}
}
Ok(ids.into())
}
fn find_tags<'a, 'b>(rules: &'a [Rule], headers: &'b [MailHeader]) -> (bool, HashSet<&'a str>) {
let mut matched_rule = false;
let mut add_tags = HashSet::new();
for rule in rules {
for hdr in headers {
if rule.is_match(&hdr.get_key(), &hdr.get_value()) {
//info!("Matched {:?}", rule);
matched_rule = true;
add_tags.insert(rule.tag.as_str());
if rule.stop_on_match {
return (true, add_tags);
}
}
}
}
(matched_rule, add_tags)
}
#[cfg(test)]
mod tests {
use super::*;
const REPORT_V1: &str = r#"
{
"organization-name": "Google Inc.",
"date-range": {
"start-datetime": "2025-08-09T00:00:00Z",
"end-datetime": "2025-08-09T23:59:59Z"
},
"contact-info": "smtp-tls-reporting@google.com",
"report-id": "2025-08-09T00:00:00Z_xinu.tv",
"policies": [
{
"policy": {
"policy-type": "sts",
"policy-string": [
"version: STSv1",
"mode: testing",
"mx: mail.xinu.tv",
"max_age: 86400"
],
"policy-domain": "xinu.tv"
},
"summary": {
"total-successful-session-count": 20,
"total-failure-session-count": 0
}
}
]
}
"#;
// The following constants are kept for future test expansion, but are currently unused.
/*
const REPORT_V2: &str = r#"
{
"organization-name": "Google Inc.",
"date-range": {
"start-datetime": "2025-08-09T00:00:00Z",
"end-datetime": "2025-08-09T23:59:59Z"
},
"contact-info": "smtp-tls-reporting@google.com",
"report-id": "2025-08-09T00:00:00Z_xinu.tv",
"policies": [
{
"policy": {
"policy-type": "sts",
"policy-string": [
"version: STSv1",
"mode": "testing",
"mx": "mail.xinu.tv",
"max_age": "86400"
],
"policy-domain": "xinu.tv",
"mx-host": [
"mail.xinu.tv"
]
},
"summary": {
"total-successful-session-count": 3,
"total-failure-session-count": 0
}
}
]
}
"#;
const REPORT_V3: &str = r#"
{
"organization-name": "Google Inc.",
"date-range": {
"start-datetime": "2025-08-09T00:00:00Z",
"end-datetime": "2025-08-09T23:59:59Z"
},
"contact-info": "smtp-tls-reporting@google.com",
"report-id": "2025-08-09T00:00:00Z_xinu.tv",
"policies": [
{
"policy": {
"policy-type": "sts",
"policy-string": [
"version: STSv1",
"mode": "testing",
"mx": "mail.xinu.tv",
"max_age": "86400"
],
"policy-domain": "xinu.tv",
"mx-host": [
{
"hostname": "mail.xinu.tv",
"failure-count": 0,
"result-type": "success"
}
]
},
"summary": {
"total-successful-session-count": 3,
"total-failure-session-count": 0
}
}
]
}
"#;
*/
#[test]
fn test_parse_tls_report_v1() {
let _report: TlsRpt = serde_json::from_str(REPORT_V1).unwrap();
}
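// A hedged sketch of a unit test for `find_tags`; it assumes `mailparse::parse_header`
// for building a `MailHeader` and that `Rule::is_match` does a substring match on the
// header value (as the commented-out "eftours" example above suggests). Adjust the
// rule fields to the real `letterbox_shared` API if they differ.
#[test]
fn test_find_tags_substring_match() {
use letterbox_shared::{Match, MatchType};
let rules = vec![Rule {
stop_on_match: false,
matches: vec![Match {
match_type: MatchType::From,
needle: "eftours".to_string(),
}],
tag: "EFTours".to_string(),
}];
let (header, _) = mailparse::parse_header(b"From: trips@eftours.com").unwrap();
let (matched, tags) = find_tags(&rules, &[header]);
assert!(matched);
assert!(tags.contains("EFTours"));
}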
}

server/src/tantivy.rs
@@ -0,0 +1,353 @@
use std::collections::HashSet;
use log::{debug, error, info, warn};
use sqlx::{postgres::PgPool, types::time::PrimitiveDateTime};
use tantivy::{
collector::{DocSetCollector, TopDocs},
doc, query,
query::{AllQuery, BooleanQuery, Occur, QueryParser, TermQuery},
schema::{Facet, IndexRecordOption, Value},
DocAddress, Index, IndexReader, Searcher, TantivyDocument, TantivyError, Term,
};
use tracing::{info_span, instrument, Instrument};
use crate::{
compute_offset_limit,
error::ServerError,
graphql::{Corpus, ThreadSummary},
newsreader::{extract_thread_id, is_newsreader_thread},
thread_summary_from_row, Query, ThreadSummaryRecord,
};
pub fn is_tantivy_query(query: &Query) -> bool {
query.is_tantivy || query.corpus == Some(Corpus::Tantivy)
}
pub struct TantivyConnection {
db_path: String,
index: Index,
reader: IndexReader,
}
fn get_index(db_path: &str) -> Result<Index, TantivyError> {
Ok(match Index::open_in_dir(db_path) {
Ok(idx) => idx,
Err(err) => {
warn!("Failed to open {db_path}: {err}");
create_news_db(db_path)?;
Index::open_in_dir(db_path)?
}
})
}
impl TantivyConnection {
pub fn new(tantivy_db_path: &str) -> Result<TantivyConnection, TantivyError> {
let index = get_index(tantivy_db_path)?;
let reader = index.reader()?;
Ok(TantivyConnection {
db_path: tantivy_db_path.to_string(),
index,
reader,
})
}
#[instrument(name = "tantivy::refresh", skip_all)]
pub async fn refresh(&self, pool: &PgPool) -> Result<(), ServerError> {
let start_time = std::time::Instant::now();
let p_uids: Vec<_> = sqlx::query_file!("sql/all-uids.sql")
.fetch_all(pool)
.instrument(info_span!("postgres query"))
.await?
.into_iter()
.map(|r| r.uid)
.collect();
info!(
"refresh from postgres got {} uids in {}",
p_uids.len(),
start_time.elapsed().as_secs_f32()
);
let t_span = info_span!("tantivy query");
let _enter = t_span.enter();
let start_time = std::time::Instant::now();
let (searcher, _query) = self.searcher_and_query(&Query::default())?;
let docs = searcher.search(&AllQuery, &DocSetCollector)?;
let uid = self.index.schema().get_field("uid")?;
let t_uids: Vec<_> = docs
.into_iter()
.map(|doc_address| {
searcher
.doc(doc_address)
.map(|doc: TantivyDocument| {
debug!("doc: {doc:#?}");
doc.get_first(uid)
.expect("uid")
.as_str()
.expect("as_str")
.to_string()
})
.expect("searcher.doc")
})
.collect();
drop(_enter);
info!(
"refresh tantivy got {} uids in {}",
t_uids.len(),
start_time.elapsed().as_secs_f32()
);
let t_set: HashSet<_> = t_uids.into_iter().collect();
let need: Vec<_> = p_uids
.into_iter()
.filter(|uid| !t_set.contains(uid.as_str()))
.collect();
if !need.is_empty() {
info!(
"need to reindex {} uids: {:?}...",
need.len(),
&need[..need.len().min(10)]
);
}
let batch_size = 1000;
let uids: Vec<_> = need[..need.len().min(batch_size)]
.iter()
.cloned()
.collect();
self.reindex_uids(pool, &uids).await
}
#[instrument(skip(self, pool))]
async fn reindex_uids(&self, pool: &PgPool, uids: &[String]) -> Result<(), ServerError> {
if uids.is_empty() {
return Ok(());
}
// TODO: add SlurpContents and convert HTML to text
let pool: &PgPool = pool;
let mut index_writer = self.index.writer(50_000_000)?;
let schema = self.index.schema();
let site = schema.get_field("site")?;
let title = schema.get_field("title")?;
let summary = schema.get_field("summary")?;
let link = schema.get_field("link")?;
let date = schema.get_field("date")?;
let is_read = schema.get_field("is_read")?;
let uid = schema.get_field("uid")?;
let id = schema.get_field("id")?;
let tag = schema.get_field("tag")?;
info!("reindexing {} posts", uids.len());
let rows = sqlx::query_file_as!(PostgresDoc, "sql/posts-from-uids.sql", uids)
.fetch_all(pool)
.await?;
if uids.len() != rows.len() {
error!(
"Had {} uids and only got {} rows: uids {uids:?}",
uids.len(),
rows.len()
);
}
for r in rows {
let id_term = Term::from_field_text(uid, &r.uid);
index_writer.delete_term(id_term);
let slug = r.site;
let tag_facet = Facet::from(&format!("/News/{slug}"));
index_writer.add_document(doc!(
site => slug.clone(),
title => r.title,
// TODO: clean and extract text from HTML
summary => r.summary,
link => r.link,
date => tantivy::DateTime::from_primitive(r.date),
is_read => r.is_read,
uid => r.uid,
id => r.id as u64,
tag => tag_facet,
))?;
}
info_span!("IndexWriter.commit").in_scope(|| index_writer.commit())?;
info_span!("IndexReader.reload").in_scope(|| self.reader.reload())?;
Ok(())
}
#[instrument(name = "tantivy::reindex_thread", skip_all, fields(query=%query))]
pub async fn reindex_thread(&self, pool: &PgPool, query: &Query) -> Result<(), ServerError> {
let uids: Vec<_> = query
.uids
.iter()
.filter(|uid| is_newsreader_thread(uid))
.map(|uid| extract_thread_id(uid).to_string())
.collect();
Ok(self.reindex_uids(pool, &uids).await?)
}
#[instrument(name = "tantivy::reindex_all", skip_all)]
pub async fn reindex_all(&self, pool: &PgPool) -> Result<(), ServerError> {
let rows = sqlx::query_file!("sql/all-posts.sql")
.fetch_all(pool)
.await?;
let uids: Vec<String> = rows.into_iter().map(|r| r.uid).collect();
self.reindex_uids(pool, &uids).await?;
Ok(())
}
fn searcher_and_query(
&self,
query: &Query,
) -> Result<(Searcher, Box<dyn query::Query>), ServerError> {
// TODO: only create one reader
// From https://tantivy-search.github.io/examples/basic_search.html
// "For a search server you will typically create one reader for the entire lifetime of
// your program, and acquire a new searcher for every single request."
//
// I think there's some challenge in making the reader work if we reindex, so the
// reader may need to be stored indirectly and be recreated on reindex.
// I think creating a reader takes 200-300 ms.
let schema = self.index.schema();
let searcher = self.reader.searcher();
let title = schema.get_field("title")?;
let summary = schema.get_field("summary")?;
let query_parser = QueryParser::for_index(&self.index, vec![title, summary]);
// Tantivy uses '*' to match all docs, not empty string
let term = &query.remainder.join(" ");
let term = if term.is_empty() { "*" } else { term };
info!("query_parser('{term}')");
let tantivy_query = query_parser.parse_query(&term)?;
let tag = schema.get_field("tag")?;
let is_read = schema.get_field("is_read")?;
let mut terms = vec![(Occur::Must, tantivy_query)];
for t in &query.tags {
let facet = Facet::from(&format!("/{t}"));
let facet_term = Term::from_facet(tag, &facet);
let facet_term_query = Box::new(TermQuery::new(facet_term, IndexRecordOption::Basic));
terms.push((Occur::Must, facet_term_query));
}
if query.unread_only {
info!("searching for unread only");
let term = Term::from_field_bool(is_read, false);
terms.push((
Occur::Must,
Box::new(TermQuery::new(term, IndexRecordOption::Basic)),
));
}
let search_query = BooleanQuery::new(terms);
Ok((searcher, Box::new(search_query)))
}
#[instrument(name="tantivy::count", skip_all, fields(query=%query))]
pub async fn count(&self, query: &Query) -> Result<usize, ServerError> {
if !is_tantivy_query(query) {
return Ok(0);
}
info!("tantivy::count {query:?}");
use tantivy::collector::Count;
let (searcher, query) = self.searcher_and_query(&query)?;
Ok(searcher.search(&query, &Count)?)
}
#[instrument(name="tantivy::search", skip_all, fields(query=%query))]
pub async fn search(
&self,
pool: &PgPool,
after: Option<i32>,
before: Option<i32>,
first: Option<i32>,
last: Option<i32>,
query: &Query,
) -> Result<Vec<(i32, ThreadSummary)>, async_graphql::Error> {
if !is_tantivy_query(query) {
return Ok(Vec::new());
}
let (offset, mut limit) = compute_offset_limit(after, before, first, last);
if before.is_none() {
// When searching forward, the +1 is to see if there are more pages of data available.
// Searching backwards implies there's more pages forward, because the value represented by
// `before` is on the next page.
limit += 1;
}
let (searcher, search_query) = self.searcher_and_query(&query)?;
info!("Tantivy::search(query '{query:?}', off {offset}, lim {limit}, search_query {search_query:?})");
let top_docs = searcher.search(
&search_query,
&TopDocs::with_limit(limit as usize)
.and_offset(offset as usize)
.order_by_u64_field("date", tantivy::index::Order::Desc),
)?;
info!("search found {} docs", top_docs.len());
let uid = self.index.schema().get_field("uid")?;
let uids = top_docs
.into_iter()
.map(|(_, doc_address): (u64, DocAddress)| {
searcher.doc(doc_address).map(|doc: TantivyDocument| {
debug!("doc: {doc:#?}");
doc.get_first(uid)
.expect("doc missing uid")
.as_str()
.expect("doc str missing")
.to_string()
})
})
.collect::<Result<Vec<String>, TantivyError>>()?;
//let uids = format!("'{}'", uids.join("','"));
info!("uids {uids:?}");
let rows = sqlx::query_file!("sql/threads-from-uid.sql", &uids as &[String])
.fetch_all(pool)
.await?;
let mut res = Vec::new();
info!("found {} hits joining w/ tantivy", rows.len());
for (i, r) in rows.into_iter().enumerate() {
res.push((
i as i32 + offset,
thread_summary_from_row(ThreadSummaryRecord {
site: r.site,
date: r.date,
is_read: r.is_read,
title: r.title,
uid: r.uid,
name: r.name,
corpus: Corpus::Tantivy,
})
.await,
));
}
Ok(res)
}
pub fn drop_and_load_index(&self) -> Result<(), TantivyError> {
create_news_db(&self.db_path)
}
}
fn create_news_db(tantivy_db_path: &str) -> Result<(), TantivyError> {
info!("create_news_db");
// Don't care if directory didn't exist
let _ = std::fs::remove_dir_all(tantivy_db_path);
std::fs::create_dir_all(tantivy_db_path)?;
use tantivy::schema::*;
let mut schema_builder = Schema::builder();
schema_builder.add_text_field("site", STRING | STORED);
schema_builder.add_text_field("title", TEXT | STORED);
schema_builder.add_text_field("summary", TEXT);
schema_builder.add_text_field("link", STRING | STORED);
schema_builder.add_date_field("date", FAST | INDEXED | STORED);
schema_builder.add_bool_field("is_read", FAST | INDEXED | STORED);
schema_builder.add_text_field("uid", STRING | STORED);
schema_builder.add_u64_field("id", FAST);
schema_builder.add_facet_field("tag", FacetOptions::default());
let schema = schema_builder.build();
Index::create_in_dir(tantivy_db_path, schema)?;
Ok(())
}
struct PostgresDoc {
site: String,
title: String,
summary: String,
link: String,
date: PrimitiveDateTime,
is_read: bool,
uid: String,
id: i32,
}
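// Hedged smoke test for `create_news_db`: building the schema and reopening the
// index in a scratch directory should round-trip. Assumes write access to the
// system temp dir; the directory name here is illustrative.
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_create_news_db_roundtrip() {
let dir = std::env::temp_dir().join("letterbox-tantivy-test");
let path = dir.to_str().unwrap();
create_news_db(path).unwrap();
let idx = Index::open_in_dir(path).unwrap();
// Every field added by the schema builder should be resolvable.
assert!(idx.schema().get_field("uid").is_ok());
let _ = std::fs::remove_dir_all(path);
}
}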

server/src/ws.rs

@@ -0,0 +1,35 @@
use std::{collections::HashMap, net::SocketAddr};
use axum::extract::ws::{Message, WebSocket};
use letterbox_shared::WebsocketMessage;
use tracing::{info, warn};
#[derive(Default)]
pub struct ConnectionTracker {
peers: HashMap<SocketAddr, WebSocket>,
}
impl ConnectionTracker {
pub async fn add_peer(&mut self, socket: WebSocket, who: SocketAddr) {
warn!("adding {who:?} to connection tracker");
self.peers.insert(who, socket);
self.send_message_all(WebsocketMessage::RefreshMessages)
.await;
}
pub async fn send_message_all(&mut self, msg: WebsocketMessage) {
info!("send_message_all {msg}");
let m = serde_json::to_string(&msg).expect("failed to json encode WebsocketMessage");
let mut bad_peers = Vec::new();
for (who, socket) in self.peers.iter_mut() {
if let Err(e) = socket.send(Message::Text(m.clone().into())).await {
warn!("{:?} is bad, scheduling for removal: {e}", who);
bad_peers.push(who.clone());
}
}
for b in bad_peers {
info!("removing bad peer {b:?}");
self.peers.remove(&b);
}
}
}


@@ -0,0 +1,59 @@
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8" />
<meta name="viewport" content="user-scalable=no, initial-scale=1.0, minimum-scale=1.0, maximum-scale=1.0, minimal-ui">
<title>GraphQL Playground</title>
<link rel="stylesheet" href="//cdn.jsdelivr.net/npm/graphql-playground-react/build/static/css/index.css" />
<link rel="shortcut icon" href="//cdn.jsdelivr.net/npm/graphql-playground-react/build/favicon.png" />
<script src="//cdn.jsdelivr.net/npm/graphql-playground-react/build/static/js/middleware.js"></script>
</head>
<body>
<div id="root">
<style>
body {
background-color: rgb(23, 42, 58);
font-family: Open Sans, sans-serif;
height: 90vh;
}
#root {
height: 100%;
width: 100%;
display: flex;
align-items: center;
justify-content: center;
}
.loading {
font-size: 32px;
font-weight: 200;
color: rgba(255, 255, 255, .6);
margin-left: 20px;
}
img {
width: 78px;
height: 78px;
}
.title {
font-weight: 400;
}
</style>
<img src='//cdn.jsdelivr.net/npm/graphql-playground-react/build/logo.png' alt=''>
<div class="loading"> Loading
<span class="title">GraphQL Playground</span>
</div>
</div>
<script>window.addEventListener('load', function (event) {
GraphQLPlayground.init(document.getElementById('root'), {
// options as 'endpoint' belong here
endpoint: "/api/graphql",
})
})</script>
</body>
</html>

server/static/vars.css

@@ -0,0 +1,42 @@
:root {
--active-brightness: 0.85;
--border-radius: 5px;
--box-shadow: 2px 2px 10px;
--color-accent: #118bee15;
--color-bg: #fff;
--color-bg-secondary: #e9e9e9;
--color-link: #118bee;
--color-secondary: #920de9;
--color-secondary-accent: #920de90b;
--color-shadow: #f4f4f4;
--color-table: #118bee;
--color-text: #000;
--color-text-secondary: #999;
--color-scrollbar: #cacae8;
--font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, Oxygen-Sans, Ubuntu, Cantarell, "Helvetica Neue", sans-serif;
--hover-brightness: 1.2;
--justify-important: center;
--justify-normal: left;
--line-height: 1.5;
/*
--width-card: 285px;
--width-card-medium: 460px;
--width-card-wide: 800px;
*/
--width-content: 1080px;
}
@media (prefers-color-scheme: dark) {
:root[color-mode="user"] {
--color-accent: #0097fc4f;
--color-bg: #333;
--color-bg-secondary: #555;
--color-link: #0097fc;
--color-secondary: #e20de9;
--color-secondary-accent: #e20de94f;
--color-shadow: #bbbbbb20;
--color-table: #0097fc;
--color-text: #f7f7f7;
--color-text-secondary: #aaa;
}
}


@@ -0,0 +1,99 @@
<!DOCTYPE html>
<html>
<head>
<title>DMARC Report</title>
</head>
<body>
{% if report.report_metadata.is_some() %}
{% let meta = report.report_metadata.as_ref().unwrap() %}
<b>Reporter:</b> {{ meta.org_name }}<br>
<b>Contact:</b> {{ meta.email }}<br>
<b>Report ID:</b> {{ meta.report_id }}<br>
{% if meta.date_range.is_some() %}
{% let dr = meta.date_range.as_ref().unwrap() %}
<b>Date range:</b>
{{ dr.begin }}
to
{{ dr.end }}
<br>
{% endif %}
{% endif %}
{% if report.policy_published.is_some() %}
{% let pol = report.policy_published.as_ref().unwrap() %}
<b>Policy Published:</b>
<ul>
<li>Domain: {{ pol.domain }}</li>
<li>ADKIM: {{ pol.adkim }}</li>
<li>ASPF: {{ pol.aspf }}</li>
<li>Policy: {{ pol.p }}</li>
<li>Subdomain Policy: {{ pol.sp }}</li>
<li>Percent: {{ pol.pct }}</li>
</ul>
{% endif %}
{% if report.record.is_some() %}
<b>Records:</b>
<table style="border-collapse:collapse;width:100%;font-size:0.95em;">
<thead>
<tr style="background:#f0f0f0;">
<th style="border:1px solid #bbb;padding:4px 8px;">Source IP</th>
<th style="border:1px solid #bbb;padding:4px 8px;">Count</th>
<th style="border:1px solid #bbb;padding:4px 8px;">Header From</th>
{% if report.has_envelope_to %}
<th style="border:1px solid #bbb;padding:4px 8px;">Envelope To</th>
{% endif %}
<th style="border:1px solid #bbb;padding:4px 8px;">Disposition</th>
<th style="border:1px solid #bbb;padding:4px 8px;">DKIM</th>
<th style="border:1px solid #bbb;padding:4px 8px;">SPF</th>
<th style="border:1px solid #bbb;padding:4px 8px;">Auth Results</th>
</tr>
</thead>
<tbody>
{% for rec in report.record.as_ref().unwrap() %}
<tr>
<td style="border:1px solid #bbb;padding:4px 8px;">{{ rec.source_ip }}</td>
<td style="border:1px solid #bbb;padding:4px 8px;">{{ rec.count }}</td>
<td style="border:1px solid #bbb;padding:4px 8px;">{{ rec.header_from }}</td>
{% if report.has_envelope_to %}
<td style="border:1px solid #bbb;padding:4px 8px;">{{ rec.envelope_to }}</td>
{% endif %}
<td style="border:1px solid #bbb;padding:4px 8px;">{{ rec.disposition }}</td>
<td style="border:1px solid #bbb;padding:4px 8px;">{{ rec.dkim }}</td>
<td style="border:1px solid #bbb;padding:4px 8px;">{{ rec.spf }}</td>
<td style="border:1px solid #bbb;padding:4px 8px;">
{% if rec.auth_results.is_some() %}
{% let auth = rec.auth_results.as_ref().unwrap() %}
{% for dkimres in auth.dkim %}
<span style="white-space:nowrap;">
DKIM: domain=<b>{{ dkimres.domain }}</b>
selector=<b>{{ dkimres.selector }}</b>
result=<b>{{ dkimres.result }}</b>
</span><br>
{% endfor %}
{% for spfres in auth.spf %}
<span style="white-space:nowrap;">
SPF: domain=<b>{{ spfres.domain }}</b>
scope=<b>{{ spfres.scope }}</b>
result=<b>{{ spfres.result }}</b>
</span><br>
{% endfor %}
{% for reason in rec.reason %}
<span style="white-space:nowrap;">Reason: {{ reason }}</span><br>
{% endfor %}
{% endif %}
</td>
</tr>
{% endfor %}
</tbody>
</table>
{% endif %}
{% if report.report_metadata.is_none() && report.policy_published.is_none() && report.record.is_none() %}
<p>No DMARC summary found.</p>
{% endif %}
</body>
</html>


@@ -0,0 +1,115 @@
<style>
.ical-flex {
display: flex;
flex-direction: row;
flex-wrap: wrap;
align-items: stretch;
gap: 0.5em;
max-width: 700px;
width: 100%;
}
.ical-flex .summary-block {
flex: 1 1 0%;
}
.ical-flex .calendar-block {
flex: none;
margin-left: auto;
}
@media (max-width: 599px) {
.ical-flex {
flex-direction: column;
}
.ical-flex>div.summary-block {
margin-bottom: 0.5em;
margin-left: 0;
}
.ical-flex>div.calendar-block {
margin-left: 0;
}
}
</style>
<div class="ical-flex">
<div class="summary-block"
style="background:#f7f7f7; border-radius:8px; box-shadow:0 2px 8px #bbb; padding:16px 18px; margin:0 0 8px 0; min-width:220px; max-width:700px; font-size:15px; color:#222;">
<div
style="display: flex; flex-direction: row; flex-wrap: wrap; align-items: flex-start; gap: 0.5em; width: 100%;">
<div style="flex: 1 1 220px; min-width: 180px;">
<div style="font-size:17px; font-weight:bold; margin-bottom:8px; color:#333;"><b>Summary:</b> {{ summary
}}</div>
<div style="margin-bottom:4px;"><b>Start:</b> {{ local_fmt_start }}</div>
<div style="margin-bottom:4px;"><b>End:</b> {{ local_fmt_end }}</div>
{% if !recurrence_display.is_empty() %}
<div style="margin-bottom:4px;">
<b>Repeats:</b> {{ recurrence_display }}
</div>
{% endif %}
{% if !organizer_cn.is_empty() %}
<div style="margin-bottom:4px;"><b>Organizer:</b> {{ organizer_cn }}</div>
{% elif !organizer.is_empty() %}
<div style="margin-bottom:4px;"><b>Organizer:</b> {{ organizer }}</div>
{% endif %}
</div>
{% if all_days.len() > 0 %}
<div class="calendar-block" style="flex: none; margin-left: auto; min-width: 180px;">
<table class="ical-month"
style="border-collapse:collapse; min-width:220px; background:#fff; box-shadow:0 2px 8px #bbb; font-size:14px; margin:0;">
<caption
style="caption-side:top; text-align:center; font-weight:bold; font-size:16px; padding-bottom:8px;">
{{ caption }}</caption>
<thead>
<tr>
{% for wd in ["Sun", "Mon", "Tue", "Wed", "Thu", "Fri", "Sat"] %}
<th
style="padding:4px 6px; border-bottom:1px solid #ccc; color:#666; font-weight:600; background:#f7f7f7">
{{ wd }}</th>
{% endfor %}
</tr>
</thead>
<tbody>
{% for week in all_days|batch(7) %}
<tr>
{% for day in week %}
{% if event_days.contains(day) && today.is_some() && today.unwrap() == day %}
<td
data-event-day="{{ day.format("%Y-%m-%d") }}"
style="background:#ffd700; color:#222; font-weight:bold; border:2px solid #2196f3; border-radius:4px; text-align:center; box-shadow:0 0 0 2px #2196f3;">
{{ day.day() }}
</td>
{% elif event_days.contains(day) %}
<td
data-event-day="{{ day.format("%Y-%m-%d") }}"
style="background:#ffd700; color:#222; font-weight:bold; border:1px solid #aaa; border-radius:4px; text-align:center;">
{{ day.day() }}
</td>
{% elif today.is_some() && today.unwrap() == day %}
<td
style="border:2px solid #2196f3; border-radius:4px; text-align:center; background:#e3f2fd; color:#222; box-shadow:0 0 0 2px #2196f3;">
{{ day.day() }}
</td>
{% else %}
<td style="border:1px solid #eee; text-align:center; background:#f7f7f7; color:#bbb;">
{{ day.day() }}
</td>
{% endif %}
{% endfor %}
</tr>
{% endfor %}
</tbody>
</table>
</div>
{% endif %}
</div>
</div>
</div>
{% if !description_paragraphs.is_empty() %}
<div style="max-width:700px; width:100%;">
{% for p in description_paragraphs %}
<p style="margin: 0 0 8px 0; color:#444;">{{ p }}</p>
{% endfor %}
</div>
{% endif %}
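The `all_days|batch(7)` loop above only renders clean Sunday-first rows if the server pads `all_days` back to the previous Sunday before handing it to the template. A minimal stdlib-only sketch of that offset calculation (the actual server presumably uses `chrono`; the helper name and Sakamoto's algorithm here are illustrative assumptions, not the repo's code):

```rust
// Sakamoto's day-of-week algorithm, 0 = Sunday. Kept stdlib-only so the
// sketch is self-contained; real code would likely use chrono's Weekday.
fn day_of_week(y: i32, m: u32, d: u32) -> u32 {
    const T: [i32; 12] = [0, 3, 2, 5, 0, 3, 5, 1, 4, 6, 2, 4];
    let y = if m < 3 { y - 1 } else { y };
    ((y + y / 4 - y / 100 + y / 400 + T[(m - 1) as usize] + d as i32).rem_euclid(7)) as u32
}

fn main() {
    // Sep 1, 2025 falls on a Monday (1), so one pad cell is needed before
    // the 1st for the first batch(7) row to start on Sunday.
    let offset = day_of_week(2025, 9, 1);
    assert_eq!(offset, 1);
    println!("pad {offset} cell(s) before the 1st");
}
```

With that offset, `all_days` can simply be the padded date plus 41 successive days, giving the six full weeks the table iterates over.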


@@ -0,0 +1,48 @@
<!DOCTYPE html>
<html>
<head>
<title>TLS Report</title>
</head>
<body>
<h3>TLS Report Summary:</h3>
<p>Organization: {{ report.organization_name }}</p>
<p>Date Range: {{ report.date_range.start_datetime }} to {{ report.date_range.end_datetime }}</p>
<p>Contact: {{ report.contact_info }}</p>
<p>Report ID: {{ report.report_id }}</p>
<h4><b>Policies:</b></h4>
{% for policy in report.policies %}
<h5><b>Policy Domain:</b> {{ policy.policy.policy_domain }}</h5>
<ul>
<li><b>Policy Type:</b> {{ policy.policy.policy_type }}</li>
<li><b>Policy String:</b> {{ policy.policy.policy_string | join(", ") }}</li>
<li><b>Successful Sessions:</b> {{ policy.summary.total_successful_session_count }}</li>
<li><b>Failed Sessions:</b> {{ policy.summary.total_failure_session_count }}</li>
</ul>
<ul>
{% for mx_host in policy.policy.mx_host %}
<li><b>Hostname:</b> {{ mx_host.hostname }}, <b>Failures:</b> {{ mx_host.failure_count }}, <b>Result:</b> {{
mx_host.result_type }}</li>
{% endfor %}
</ul>
<ul>
{% for detail in policy.failure_details %}
<li><b>Result:</b> {{ detail.result_type }}, <b>Sending IP:</b> {{ detail.sending_mta_ip }}, <b>Failed
Sessions:</b> {{ detail.failed_session_count }}
{% if detail.failure_reason_code != "" %}
(<b>Reason:</b> {{ detail.failure_reason_code }})
{% endif %}
(<b>Receiving IP:</b> {{ detail.receiving_ip }})
(<b>Receiving MX:</b> {{ detail.receiving_mx_hostname }})
(<b>Additional Info:</b> {{ detail.additional_info }})
</li>
{% endfor %}
</ul>
{% endfor %}
</body>
</html>


@@ -0,0 +1,48 @@
<?xml version="1.0" encoding="UTF-8" ?>
<feedback>
<version>1.0</version>
<report_metadata>
<org_name>google.com</org_name>
<email>noreply-dmarc-support@google.com</email>
<extra_contact_info>https://support.google.com/a/answer/2466580</extra_contact_info>
<report_id>5142106658860834914</report_id>
<date_range>
<begin>1755302400</begin>
<end>1755388799</end>
</date_range>
</report_metadata>
<policy_published>
<domain>xinu.tv</domain>
<adkim>s</adkim>
<aspf>s</aspf>
<p>quarantine</p>
<sp>reject</sp>
<pct>100</pct>
<np>reject</np>
</policy_published>
<record>
<row>
<source_ip>74.207.253.222</source_ip>
<count>1</count>
<policy_evaluated>
<disposition>none</disposition>
<dkim>pass</dkim>
<spf>pass</spf>
</policy_evaluated>
</row>
<identifiers>
<header_from>xinu.tv</header_from>
</identifiers>
<auth_results>
<dkim>
<domain>xinu.tv</domain>
<result>pass</result>
<selector>mail</selector>
</dkim>
<spf>
<domain>xinu.tv</domain>
<result>pass</result>
</spf>
</auth_results>
</record>
</feedback>

server/testdata/dmarc-example.xml vendored Normal file

@@ -0,0 +1,78 @@
<?xml version="1.0"?>
<feedback xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<version>1.0</version>
<report_metadata>
<org_name>Outlook.com</org_name>
<email>dmarcreport@microsoft.com</email>
<report_id>e6c5a2ce6e074d7d8cd041a0d6f32a3d</report_id>
<date_range>
<begin>1755302400</begin>
<end>1755388800</end>
</date_range>
</report_metadata>
<policy_published>
<domain>xinu.tv</domain>
<adkim>s</adkim>
<aspf>s</aspf>
<p>quarantine</p>
<sp>reject</sp>
<pct>100</pct>
<fo>1</fo>
</policy_published>
<record>
<row>
<source_ip>74.207.253.222</source_ip>
<count>1</count>
<policy_evaluated>
<disposition>none</disposition>
<dkim>pass</dkim>
<spf>pass</spf>
</policy_evaluated>
</row>
<identifiers>
<envelope_to>msn.com</envelope_to>
<envelope_from>xinu.tv</envelope_from>
<header_from>xinu.tv</header_from>
</identifiers>
<auth_results>
<dkim>
<domain>xinu.tv</domain>
<selector>mail</selector>
<result>pass</result>
</dkim>
<spf>
<domain>xinu.tv</domain>
<scope>mfrom</scope>
<result>pass</result>
</spf>
</auth_results>
</record>
<record>
<row>
<source_ip>74.207.253.222</source_ip>
<count>1</count>
<policy_evaluated>
<disposition>none</disposition>
<dkim>pass</dkim>
<spf>pass</spf>
</policy_evaluated>
</row>
<identifiers>
<envelope_to>hotmail.com</envelope_to>
<envelope_from>xinu.tv</envelope_from>
<header_from>xinu.tv</header_from>
</identifiers>
<auth_results>
<dkim>
<domain>xinu.tv</domain>
<selector>mail</selector>
<result>pass</result>
</dkim>
<spf>
<domain>xinu.tv</domain>
<scope>mfrom</scope>
<result>pass</result>
</spf>
</auth_results>
</record>
</feedback>


@@ -0,0 +1,167 @@
Return-Path: <couchmoney+caf_=gmail=xinu.tv@gmail.com>
Delivered-To: bill@xinu.tv
Received: from phx.xinu.tv [74.207.253.222]
by nixos-01.h.xinu.tv with IMAP (fetchmail-6.5.1)
for <wathiede@localhost> (single-drop); Mon, 25 Aug 2025 14:29:47 -0700 (PDT)
Received: from phx.xinu.tv
by phx.xinu.tv with LMTP
id TPD3E8vVrGjawyMAJR8clQ
(envelope-from <couchmoney+caf_=gmail=xinu.tv@gmail.com>)
for <bill@xinu.tv>; Mon, 25 Aug 2025 14:29:47 -0700
X-Original-To: gmail@xinu.tv
Received-SPF: Pass (mailfrom) identity=mailfrom; client-ip=2a00:1450:4864:20::12e; helo=mail-lf1-x12e.google.com; envelope-from=couchmoney+caf_=gmail=xinu.tv@gmail.com; receiver=xinu.tv
Authentication-Results: phx.xinu.tv;
dkim=pass (2048-bit key; unprotected) header.d=google.com header.i=@google.com header.a=rsa-sha256 header.s=20230601 header.b=4sz9KOqm
Received: from mail-lf1-x12e.google.com (mail-lf1-x12e.google.com [IPv6:2a00:1450:4864:20::12e])
by phx.xinu.tv (Postfix) with ESMTPS id 2F9058B007
for <gmail@xinu.tv>; Mon, 25 Aug 2025 14:29:45 -0700 (PDT)
Received: by mail-lf1-x12e.google.com with SMTP id 2adb3069b0e04-55f4969c95aso994593e87.0
for <gmail@xinu.tv>; Mon, 25 Aug 2025 14:29:45 -0700 (PDT)
ARC-Seal: i=2; a=rsa-sha256; t=1756157384; cv=pass;
d=google.com; s=arc-20240605;
b=Y2CP7y9twLnWB5v8iyzZCw0vp33wQBS0qzltdtzX2NIWFhHu6MEp2XH8cONssaGrEN
kyjXajT7uaEpn6G8H6/NB9v9Vo2yk5Lq2f+RhODMYoocYs9YY9NJI4ZxMph0UeMO6RkQ
m+HH0iIeC2Mzgj1Bzq4qFEwb397YIijoxx+1RxyA2D3cwSuZtERSvFOEkHqv9ziWxBcD
u3tvySEuzjyQFU6bxfkax6sZljSRGzfj0iZJAl/Fw5tUgrhndQ55O5RDe4NfPNj0cw/3
XDELzsnepBgnW8Jpqpnh7iK6XMFSf4sPQmyiMCMDNVYtmm6hYFNo3/dOpgaPn/ImRr8j
d9lw==
ARC-Message-Signature: i=2; a=rsa-sha256; c=relaxed/relaxed; d=google.com; s=arc-20240605;
h=to:from:subject:date:message-id:auto-submitted:reply-to
:mime-version:dkim-signature:delivered-to;
bh=RJDaNO07yMMdVMfY1VnSbfmQtoKb6bs6XzWwF6+91ZY=;
fh=xB02AmI2fnPF5rMnM90IwqQ6Il76V+xMgSnSW+E42fE=;
b=H7Ze4a8zoCYB77xcnUnFTogJ/utYS/USzTL/7eS3nA6OPbD+zWRiiVmbSfQcNK7d25
LapXyYnRJKgc8sqqQ6XO26STA8xx/9G620pdTytChIzKsmm/T5cdlf1M8DJ+NlwkzzSG
6Xe5I0MuXSKzBDMmcBcMlY9+mp61eZNo/cGT34MfZvLDS7JCs5uQYy2gRyajCKzRddEP
NBfMgnP1Ag9B5KkpJr4QfA2IWoNlj/qom/bRcdcdjwQ3gwDeiG8rdrEwBt9juwqk8d95
C0LnVKfrXAZgolmJpljyIFb1IMMyBUIQhK+7cXFhV1AD6Laz0df9gmPWp5mGZz9qlYaY
BqJA==;
darn=xinu.tv
ARC-Authentication-Results: i=2; mx.google.com;
dkim=pass header.i=@google.com header.s=20230601 header.b=4sz9KOqm;
spf=pass (google.com: domain of 3odssaaoscuanoeqnnkpiuugcvvnguejqqnu.qtieqwejoqpgaiockn.eqo@calendar-server.bounces.google.com designates 209.85.220.73 as permitted sender) smtp.mailfrom=3OdSsaAoSCuANOEQNNKPIUUGCVVNGUEJQQNU.QTIEQWEJOQPGaIOCKN.EQO@calendar-server.bounces.google.com;
dmarc=pass (p=REJECT sp=REJECT dis=NONE) header.from=google.com;
dara=pass header.i=@gmail.com
X-Google-DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed;
d=1e100.net; s=20230601; t=1756157384; x=1756762184;
h=to:from:subject:date:message-id:auto-submitted:reply-to
:mime-version:dkim-signature:delivered-to:x-forwarded-for
:x-forwarded-to:x-gm-message-state:from:to:cc:subject:date
:message-id:reply-to;
bh=RJDaNO07yMMdVMfY1VnSbfmQtoKb6bs6XzWwF6+91ZY=;
b=m95okwnmqNvW4GhCfY8yZvCu5NxuhHCL2+A54SlIrRudednXK05YGzjZ5LOuCAaY1g
htpRv2cGHBj2mEnHh+3GIX5vQCmXw2ptzOGzfYe9TwavuKPkkKPiSD5wA1fk8quqHDOD
4XDM7dsn3xewJ+6GQyc6NPBQq53hmpAojbLXnmNtAIyfAvuxtHP1G+GSO+ZIApgg56K6
TaYrwqnRx66P8B2Ze111LCdnmOOLzweJ1muYyavPdCtTG5BbJgqzaI67bQhuUNZDhVbP
FdtT4Q7WzNt30JHCVIAkkHejD9Fh/mYSmETXpD+ISvZJ47DNnLP4RXjmmAWcHJkKsh+q
v3QQ==
X-Forwarded-Encrypted: i=2; AJvYcCUeIjyIxPoWuMqg9l5aomQv7Z9wLYkwDIS1FYz7bNmHs1Cs0CSHG8Y5B0iU/nlo9xRenTW/Xw==@xinu.tv
X-Gm-Message-State: AOJu0Yznjr5TC7UpZJk74jrsJzMBwx6/39s9e5ufIA5/FmHZ6I1bEdTc
vqpeeLdzSZTI2uZiR7zzKHiwmNJHt/LncR9kDR5f0I6b3MZuXpAgr0aKYdXw7B+b+h7D7uMM3Tm
JF9ccf09JxIzRzeRI9Vb52PUs4SIeiIU9J80QY53UqN/Rx8XMF+ncRSX5d4V4pQ==
X-Received: by 2002:a05:6512:110e:b0:55f:3bab:f204 with SMTP id 2adb3069b0e04-55f3babf35emr3087055e87.31.1756156987711;
Mon, 25 Aug 2025 14:23:07 -0700 (PDT)
X-Forwarded-To: gmail@xinu.tv
X-Forwarded-For: couchmoney@gmail.com gmail@xinu.tv
Delivered-To: couchmoney@gmail.com
Received: by 2002:a05:6504:6116:b0:2b8:eb6f:82ec with SMTP id i22csp44357ltt;
Mon, 25 Aug 2025 14:23:06 -0700 (PDT)
X-Received: by 2002:a05:6e02:164e:b0:3ed:94a6:2edb with SMTP id e9e14a558f8ab-3ed94a63097mr41416195ab.21.1756156986122;
Mon, 25 Aug 2025 14:23:06 -0700 (PDT)
ARC-Seal: i=1; a=rsa-sha256; t=1756156986; cv=none;
d=google.com; s=arc-20240605;
b=Nu0W/67J2nYqDAXf27QdfmUyuA6TGJwusKLaHRaE05YdEu/FWLfUk2ATV+g3iUQ19b
wh7awaA5kemxwiBqAy5kjjlXqlDrkK0Ow2fANdc6lRKvlRNJRYUnojMkP8w/v4Nv8YQj
Wci0HMhL4ni/yeqXeoaj1yKtwJU5MvRMxZZC7TinlCHKF5+MqgD8VNax8OTDOqxYvSDi
aIlyUBTial0AiP/K+3bsoIWEc2RoyBBBNIe88C4s1fcv17GCGn5RkN3lYtr+nwvp5wNE
fKxPCYMtXkNyv8jgjmgxKLcYBDK0B4Zo+ghMWXZneDWo3qotDVkr0GBC3J2N7BcZpjCA
XEDA==
ARC-Message-Signature: i=1; a=rsa-sha256; c=relaxed/relaxed; d=google.com; s=arc-20240605;
h=to:from:subject:date:message-id:auto-submitted:reply-to
:mime-version:dkim-signature;
bh=RJDaNO07yMMdVMfY1VnSbfmQtoKb6bs6XzWwF6+91ZY=;
fh=mbzrMIWIgWMC0ni1xEx+ViW4J0RLAdLdPT2cX81nTlk=;
b=NvhrlkKGEVx63UMsx510U8ePUo7OgRQBWxZ4BIpQWg6Fk0jJPaZgRoEpUdZ747et1P
rWTx/yVaEUHBqWtt0I4ktiD8Hr4cVqAwKvtiN32JpkGCsVBjYBWqxEalWIOg6abn8xLE
7x9j4GqD/cQhd3DiS6UtADsJ67MjjzLpGkskvxo67vKRGCfSLCKdbna2LO5TtoZ7fKO7
i+dhDol6IIgA2Sg+PZlzq6gbZTaFbglUNI7uOwz0fNWjhHH4ZfmPEycYxJ9bTuPISrqS
BkXxGQFkvlg42NHWt5L8aPzrx8OMoYfTniIqU19GeEFEVUbmzYCg/twZ0f5nxugHWDbD
PMvQ==;
dara=google.com
ARC-Authentication-Results: i=1; mx.google.com;
dkim=pass header.i=@google.com header.s=20230601 header.b=4sz9KOqm;
spf=pass (google.com: domain of 3odssaaoscuanoeqnnkpiuugcvvnguejqqnu.qtieqwejoqpgaiockn.eqo@calendar-server.bounces.google.com designates 209.85.220.73 as permitted sender) smtp.mailfrom=3OdSsaAoSCuANOEQNNKPIUUGCVVNGUEJQQNU.QTIEQWEJOQPGaIOCKN.EQO@calendar-server.bounces.google.com;
dmarc=pass (p=REJECT sp=REJECT dis=NONE) header.from=google.com;
dara=pass header.i=@gmail.com
Received: from mail-sor-f73.google.com (mail-sor-f73.google.com. [209.85.220.73])
by mx.google.com with SMTPS id ca18e2360f4ac-886c8fc41ebsor461233039f.7.2025.08.25.14.23.05
for <couchmoney@gmail.com>
(Google Transport Security);
Mon, 25 Aug 2025 14:23:06 -0700 (PDT)
Received-SPF: pass (google.com: domain of 3odssaaoscuanoeqnnkpiuugcvvnguejqqnu.qtieqwejoqpgaiockn.eqo@calendar-server.bounces.google.com designates 209.85.220.73 as permitted sender) client-ip=209.85.220.73;
Authentication-Results: mx.google.com;
dkim=pass header.i=@google.com header.s=20230601 header.b=4sz9KOqm;
spf=pass (google.com: domain of 3odssaaoscuanoeqnnkpiuugcvvnguejqqnu.qtieqwejoqpgaiockn.eqo@calendar-server.bounces.google.com designates 209.85.220.73 as permitted sender) smtp.mailfrom=3OdSsaAoSCuANOEQNNKPIUUGCVVNGUEJQQNU.QTIEQWEJOQPGaIOCKN.EQO@calendar-server.bounces.google.com;
dmarc=pass (p=REJECT sp=REJECT dis=NONE) header.from=google.com;
dara=pass header.i=@gmail.com
DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed;
d=google.com; s=20230601; t=1756156985; x=1756761785; dara=google.com;
h=to:from:subject:date:message-id:auto-submitted:reply-to
:mime-version:from:to:cc:subject:date:message-id:reply-to;
bh=RJDaNO07yMMdVMfY1VnSbfmQtoKb6bs6XzWwF6+91ZY=;
b=4sz9KOqmGGwObcaR0iSSMVeeMvZHqMzvY4cw++RddJd0V48WoyPPI5q1oMeGiVZ6fm
eEWVr8xH9/T1JUqUZXJHY6CPixN9nTpLvZlpikG1KOFv5+I5DNVX/O5i6M5C/yIPRVGv
ja0ygA7WTL48IkHV7+PTPwHmhF8zv1/BeNdko4BSywfql64J6NMM5RnOAejTIf5AR/IL
CW7H2IcmiOGBHfgMApQljg3wB+WgUel7RXZfMnHCbSlmynJ6bDJ4tq7uU16GLpnI6qAe
s9w8cOpFPiQk8uKEqdc682XxKlwqYdh07RWO/EdlZ8WeSoxMfU6YZL7c1s6xxK2c9sT7
8Xxg==
X-Google-Smtp-Source: AGHT+IFJwttd47Uo06h0EKkogFtVf4poWcHfmodh4dZqSviwYROSgnnyI2ZJSibXGnOUHiLIfAwFn6KP9CzXMoyncWSb
MIME-Version: 1.0
X-Received: by 2002:a05:6602:14c9:b0:884:47f0:b89f with SMTP id
ca18e2360f4ac-886bd0f2960mr1726062839f.3.1756156985586; Mon, 25 Aug 2025
14:23:05 -0700 (PDT)
Reply-To: tconvertino@gmail.com
Auto-Submitted: auto-generated
Message-ID: <calendar-43033c42-cc1e-4014-a5e8-c4552d41247e@google.com>
Date: Mon, 25 Aug 2025 21:23:05 +0000
Subject: New event: McClure BLT @ Monthly from 7:30am to 8:30am on the second
Thursday from Thu Sep 11 to Fri Jan 30, 2026 (PDT) (tconvertino@gmail.com)
From: "lmcollings@seattleschools.org (Google Calendar)" <calendar-notification@google.com>
To: couchmoney@gmail.com
Content-Type: multipart/alternative; boundary="0000000000004bc1be063d372904"
--0000000000004bc1be063d372904
Content-Type: text/plain; charset="UTF-8"; format=flowed; delsp=yes
Content-Transfer-Encoding: base64
TWNDbHVyZSBCTFQNCk1vbnRobHkgZnJvbSA3OjMwYW0gdG8gODozMGFtIG9uIHRoZSBzZWNvbmQg
VGh1cnNkYXkgZnJvbSBUaHVyc2RheSBTZXAgMTEgIA0KdG8gRnJpZGF5IEphbiAzMCwgMjAyNg0K
UGFjaWZpYyBUaW1lIC0gTG9zIEFuZ2VsZXMNCg0KTG9jYXRpb24NCk1jQ2x1cmUgTGlicmFyeQkN
Cmh0dHBzOi8vd3d3Lmdvb2dsZS5jb20vbWFwcy9zZWFyY2gvTWNDbHVyZStMaWJyYXJ5P2hsPWVu
DQoNCg0KDQpCTFQgd2lsbCBtZWV0IG9uIHRoZSAybmQgVGh1cnNkYXkgb2YgZXZlcnkgbW9udGgg
dW50aWwgSmFudWFyeSB3aGVuIHdlICANCmJlZ2luIGxvb2tpbmcgYXQgYnVkZ2V0LiBBZGRpdGlv
bmFsIG1lZXRpbmdzIG1heSBhbHNvIGJlIHNjaGVkdWxlZCBlYXJsaWVyICANCmlmIG5lZWRlZC4N
ClRoYW5rcywNCk1jQ2x1cmUgQkxUDQoNCg0KDQpPcmdhbml6ZXINCmxtY29sbGluZ3NAc2VhdHRs
ZXNjaG9vbHMub3JnDQpsbWNvbGxpbmdzQHNlYXR0bGVzY2hvb2xzLm9yZw0KDQpHdWVzdHMNCmxt
Y29sbGluZ3NAc2VhdHRsZXNjaG9vbHMub3JnIC0gb3JnYW5pemVyDQp0Y29udmVydGlub0BnbWFp
bC5jb20gLSBjcmVhdG9yDQptYW5kcy5hbmRydXNAZ21haWwuY29tDQphbXNjaHVtZXJAc2VhdHRs
ZXNjaG9vbHMub3JnDQphcGplbm5pbmdzQHNlYXR0bGVzY2hvbHMub3JnDQpsbWJsYXVAc2VhdHRs
ZXNjaG9vbHMub3JnDQptbmxhbmRpc0BzZWF0dGxlc2Nob29scy5vcmcNCnRtYnVyY2hhcmR0QHNl
YXR0bGVzY2hvb2xzLm9yZw0KbWNjbHVyZWFsbHN0YWZmQHNlYXR0bGVzY2hvbHMub3JnIC0gb3B0
aW9uYWwNClZpZXcgYWxsIGd1ZXN0IGluZm8gIA0KaHR0cHM6Ly9jYWxlbmRhci5nb29nbGUuY29t
L2NhbGVuZGFyL3I/ZWlkPVh6WXdjVE13WXpGbk5qQnZNekJsTVdrMk1HODBZV016WnpZd2NtbzRa
M0JzT0RoeWFqSmpNV2c0TkhNelpHZzVae1l3Y3pNd1l6Rm5OakJ2TXpCak1XYzNORG96T0dkb2Fq
WXhNR3RoWjNFeE5qUnhhemhuY0djMk5HOHpNR014WnpZd2J6TXdZekZuTmpCdk16QmpNV2MyTUc4
ek1tTXhaell3YnpNd1l6Rm5PR2R4TTJGalNXODNOSUF6YVdReGJUY3hNbXBqWkRGck5qVXhNamhq
TVcwM01USnFNbWRvYnpnMGN6TTJaSEJwTmprek1DQjBZMjl1ZG1WeWRHbHViMEJ0JmVzPTENCg0K
fn4vL35+DQpJbnZpdGF0aW9uIGZyb20gR29vZ2xlIENhbGVuZGFyOiBodHRwczovL2NhbGVuZGFy
Lmdvb2dsZS5jb20vY2FsZW5kYXIvDQoNCllvdSBhcmUgcmVjZWl2aW5nIHRoaXMgZW1haWwgYmVj
YXVzZSB5b3UgYXJlIHN1YnNjcmliZWQgdG8gY2FsZW5kYXIgIA0Kbm90aWZpY2F0aW9ucy4gVG8g
c3RvcCByZWNlaXZpbmcgdGhlc2UgZW1haWxzLCBnbyB0byAgDQpodHRwczovL2NhbGVuZGFyLmdv
b2dsZS5jb20vY2FsZW5kYXIvci9zZXR0aW5ncywgc2VsZWN0IHRoaXMgY2FsZW5kYXIsIGFuZCANCmNoYW5nZSAiT3RoZXIgbm90aWZpY2F0aW9ucyIuDQoNCkZvcndhcmRpbmcgdGhpcyBpbnZp
dGF0aW9uIGNvdWxkIGFsbG93IGFueSByZWNpcGllbnQgdG8gc2VuZCBhIHJlc3BvbnNlIHRvICAN
CnRoZSBvcmdhbml6ZXIsIGJlIGFkZGVkIHRvIHRoZSBndWVzdCBsaXN0LCBpbnZpdGUgb3RoZXJz
IHJlZ2FyZGxlc3Mgb2YgIA0KdGhlaXIgb3duIGludml0YXRpb24gc3RhdHVzLCBvciBtb2RpZnkg
eW91ciBSU1ZQLg0KDQpMZWFybiBtb3JlIGh0dHBzOi8vc3VwcG9ydC5nb29nbGUuY29tL2NhbGVu
ZGFyL2Fuc3dlci8zNzEzNSNmb3J3YXJkaW5nDQo=
--0000000000004bc1be063d372904--


@@ -0,0 +1,206 @@
Return-Path: <couchmoney+caf_=gmail=xinu.tv@gmail.com>
Delivered-To: bill@xinu.tv
Received: from phx.xinu.tv [74.207.253.222]
by nixos-01.h.xinu.tv with IMAP (fetchmail-6.5.1)
for <wathiede@localhost> (single-drop); Thu, 28 Aug 2025 12:11:15 -0700 (PDT)
Received: from phx.xinu.tv
by phx.xinu.tv with LMTP
id 1gVrANOpsGg9TSQAJR8clQ
(envelope-from <couchmoney+caf_=gmail=xinu.tv@gmail.com>)
for <bill@xinu.tv>; Thu, 28 Aug 2025 12:11:15 -0700
X-Original-To: gmail@xinu.tv
Received-SPF: Pass (mailfrom) identity=mailfrom; client-ip=2a00:1450:4864:20::230; helo=mail-lj1-x230.google.com; envelope-from=couchmoney+caf_=gmail=xinu.tv@gmail.com; receiver=xinu.tv
Authentication-Results: phx.xinu.tv;
dkim=pass (2048-bit key; unprotected) header.d=google.com header.i=@google.com header.a=rsa-sha256 header.s=20230601 header.b=RjBRlfFL;
dkim=pass (2048-bit key; unprotected) header.d=gmail.com header.i=@gmail.com header.a=rsa-sha256 header.s=20230601 header.b=HaiL0lRL
Received: from mail-lj1-x230.google.com (mail-lj1-x230.google.com [IPv6:2a00:1450:4864:20::230])
by phx.xinu.tv (Postfix) with ESMTPS id B4E848B007
for <gmail@xinu.tv>; Thu, 28 Aug 2025 12:11:13 -0700 (PDT)
Received: by mail-lj1-x230.google.com with SMTP id 38308e7fff4ca-336a85b8fc5so8142611fa.3
for <gmail@xinu.tv>; Thu, 28 Aug 2025 12:11:13 -0700 (PDT)
ARC-Seal: i=2; a=rsa-sha256; t=1756408272; cv=pass;
d=google.com; s=arc-20240605;
b=Nq93fJSEgPuxWsaf3dc6cCKbOP/bXMQJfmuZJBvrid99GipahJY/Ka4SGoLc8HBMH2
Ip9YDLG2Lblqz/N1KOud9gnAmQ6Zg4hfPZGvhUfCGaXbCi2lOhRlfx6QM0lM1B8rAXaA
S3Lt2qFFXrVBlvaJePwI+wVpc1wPbvd5PblaaUTYUVJeYSfdPtgNAy0Aehty9TF0Jo2h
9yrzCWMJ6kMTpsDw7sfDSnv7s43Q3jOPzXDjHdJfrK8aUXGQenwT+1acJkIw78wBFt3R
IG5CBLIKmwDpjquJzRPkEjHiNDRxhaKaCShTCVLTjmrYgbHXPM/gUewaKLfeIuTzOVuA
mnkw==
ARC-Message-Signature: i=2; a=rsa-sha256; c=relaxed/relaxed; d=google.com; s=arc-20240605;
h=to:from:subject:date:message-id:auto-submitted:sender:reply-to
:mime-version:dkim-signature:dkim-signature:delivered-to;
bh=lgr/fFBrye/qM438Us9TAp1/DYWNuYxn2NUL4vzX/SU=;
fh=twOWSYT+4sbeBuT1oeA5xzauBIj0SLZH5qI1YanOQio=;
b=FBstDUezbqJRRRxTwlKY4UXNSJ4z9aZdvb9KOlxXfFLCzUh3r5w+9P4+a/uH1Uw65g
xbxzPRgMduPWgKDAweqXk9SGX3mjqF0oyd5yhGTiU/jpHg6ZLXf//g45zJqRjfMnRi8I
vbEEAxUKyhPfbrQ8/byfq/isJHFiR0Vjr2U0HOqcctRgCTfrZr1b14jRVopjVqhk37ef
KapCbmTbBLznJLQH6jfi4LvKpSlJDW6l7R/CC4WtAzgcmHyA9nfjM4+egLg15giMpn3a
549c+jYBFgsjblhmyFw05dGSpUvP+jJeKTcFnlZe6yU7Qjnqhs6TlV/Jm8HAkPH1zdS5
XDAw==;
darn=xinu.tv
ARC-Authentication-Results: i=2; mx.google.com;
dkim=pass header.i=@google.com header.s=20230601 header.b=RjBRlfFL;
dkim=pass header.i=@gmail.com header.s=20230601 header.b=HaiL0lRL;
spf=pass (google.com: domain of tconvertino@gmail.com designates 209.85.220.73 as permitted sender) smtp.mailfrom=tconvertino@gmail.com;
dmarc=pass (p=NONE sp=QUARANTINE dis=NONE) header.from=gmail.com;
dara=pass header.i=@gmail.com
X-Google-DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed;
d=1e100.net; s=20230601; t=1756408272; x=1757013072;
h=to:from:subject:date:message-id:auto-submitted:sender:reply-to
:mime-version:dkim-signature:dkim-signature:delivered-to
:x-forwarded-for:x-forwarded-to:x-gm-message-state:from:to:cc
:subject:date:message-id:reply-to;
bh=lgr/fFBrye/qM438Us9TAp1/DYWNuYxn2NUL4vzX/SU=;
b=VJaqGIpPE1gxGhbAl1Np3yZR/0QPEs/C6KtFdnsaH9ubxFrDOeF4uIygqAUN9YFmll
YZsN4G0iexB097atKRIXLrreE3pH3cOY56ym94fWRZGythS0MRZlw40QoHLLf3joTC6D
WHtaNcea0hO3V6l/6gKlOffJ/cv2GnyPi0Sv7neOC5v18VTxZwZn+Wp+pTPpWFcmvQ4J
IMSV0vNgIRrYJaItUt1d59B9Ah+0bcyd7jJ0TDRVvN97S8iSlSIw6NMwxjZMuyJSWO7X
5zm8xA+H+L8+pLMmGKfdBYxhNo/ibdwda+w/ECKIjdnFtbreGbYLsUnkLdPeumQ6LXs/
Q2mg==
X-Forwarded-Encrypted: i=2; AJvYcCXpJ2X9EF2q2d4efhhe9B8o7LcuPPe25tZZwgkhfxerDzSbY0obB8Eik41xltO5i7k4ANaJKQ==@xinu.tv
X-Gm-Message-State: AOJu0Yz5+coY8ftW9IS5OD7ZbkwXnD43Mcp5BZjn5I2cv4v+u+ilxOi+
0DKABW1HVFh3MqQ/Z9nU+svpDl4kHa5lTr5siCXHTf0Wpo4LT3UsILyLUvwua0tsx9da14Gl6Fb
R1xVSmax6VR4PgZzrnOKZZx1x1re2RaTFGMAaA0Ei5ua3bZpn8axccwggYc94Jw==
X-Received: by 2002:a2e:a984:0:b0:336:7b24:2af7 with SMTP id 38308e7fff4ca-3367b242dd2mr36540291fa.17.1756408271464;
Thu, 28 Aug 2025 12:11:11 -0700 (PDT)
X-Forwarded-To: gmail@xinu.tv
X-Forwarded-For: couchmoney@gmail.com gmail@xinu.tv
Delivered-To: couchmoney@gmail.com
Received: by 2002:a05:6504:955:b0:2b8:eb6f:82ec with SMTP id k21csp1133490lts;
Thu, 28 Aug 2025 12:11:10 -0700 (PDT)
X-Received: by 2002:a05:6602:3c3:b0:86d:9ec7:267e with SMTP id ca18e2360f4ac-886bd155520mr3955796839f.4.1756408269941;
Thu, 28 Aug 2025 12:11:09 -0700 (PDT)
ARC-Seal: i=1; a=rsa-sha256; t=1756408269; cv=none;
d=google.com; s=arc-20240605;
b=Gvk+jquchLt+hySEph55datOhigiuAMXW4mgi5vTVp51rzJ7PB+rH7vx23tj1QAB+0
RIOZTaB67H8yFXwAUNZWd1GMnpocZR+tI4bMxbKzDYd7zgaTzSSa2InDROhqOhHqBpX8
eWD23F+xRon/qEYQd0YEjZVt20WvKzpvjbpvCyWpq7Z4y376KoJArxsspsKZlALrCfKq
cyt9B/EKr3ZmAzRiswiH7KY/iHd1qYgtYy0tYGNtjU0nZ+5fK/tVlw+lJuLtt+aA+ZCy
o5y8Y5/thdSJsT159u+bV5eICZWC5kGnztNsXg0Nr2H22XzUC1epWZvJkZW2j+SXQm5k
Wdew==
ARC-Message-Signature: i=1; a=rsa-sha256; c=relaxed/relaxed; d=google.com; s=arc-20240605;
h=to:from:subject:date:message-id:auto-submitted:sender:reply-to
:mime-version:dkim-signature:dkim-signature;
bh=lgr/fFBrye/qM438Us9TAp1/DYWNuYxn2NUL4vzX/SU=;
fh=mbzrMIWIgWMC0ni1xEx+ViW4J0RLAdLdPT2cX81nTlk=;
b=hra/E01IWuIFrWtk3uTcoj04apbHeQcQBSINqYDpr3cO7rXknIvpeXoWLvk0EIJI5y
syt60ekwVnsX/qb2F1HbN896dm97QrEGIwAiJyN2oTFauLoYObpcuhPS317hU4+YubO+
RLUntXsPK2qiifmPCOMPD6wACQB9YXpOPHrrl5x/yZlria1Tfg3XQcZIYsWcU/Qil94x
GtK+i82uzPXEQ0fVieEgJaZtmrW7OFEpPjd1KGp6sYtGvOxUfxVKl5MhLrCqfcLN9fd7
Xren0S32b/IsZA8ASdFca3CNjaAL2Ajlatb39XN17txnKrpQje/ReiVkm9wwo194NwCp
3dfQ==;
dara=google.com
ARC-Authentication-Results: i=1; mx.google.com;
dkim=pass header.i=@google.com header.s=20230601 header.b=RjBRlfFL;
dkim=pass header.i=@gmail.com header.s=20230601 header.b=HaiL0lRL;
spf=pass (google.com: domain of tconvertino@gmail.com designates 209.85.220.73 as permitted sender) smtp.mailfrom=tconvertino@gmail.com;
dmarc=pass (p=NONE sp=QUARANTINE dis=NONE) header.from=gmail.com;
dara=pass header.i=@gmail.com
Received: from mail-sor-f73.google.com (mail-sor-f73.google.com. [209.85.220.73])
by mx.google.com with SMTPS id ca18e2360f4ac-88711b2248fsor90547939f.5.2025.08.28.12.11.09
for <couchmoney@gmail.com>
(Google Transport Security);
Thu, 28 Aug 2025 12:11:09 -0700 (PDT)
Received-SPF: pass (google.com: domain of tconvertino@gmail.com designates 209.85.220.73 as permitted sender) client-ip=209.85.220.73;
Authentication-Results: mx.google.com;
dkim=pass header.i=@google.com header.s=20230601 header.b=RjBRlfFL;
dkim=pass header.i=@gmail.com header.s=20230601 header.b=HaiL0lRL;
spf=pass (google.com: domain of tconvertino@gmail.com designates 209.85.220.73 as permitted sender) smtp.mailfrom=tconvertino@gmail.com;
dmarc=pass (p=NONE sp=QUARANTINE dis=NONE) header.from=gmail.com;
dara=pass header.i=@gmail.com
DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed;
d=google.com; s=20230601; t=1756408269; x=1757013069; dara=google.com;
h=to:from:subject:date:message-id:auto-submitted:sender:reply-to
:mime-version:from:to:cc:subject:date:message-id:reply-to;
bh=lgr/fFBrye/qM438Us9TAp1/DYWNuYxn2NUL4vzX/SU=;
b=RjBRlfFLVsAeeTCwo5Z3c1Y5G+pvz4XSTyHiVKUHmxClmpM30ZeHTVLl36njuM/7rx
mFwbzGk80zXgGpZyc7qnhSIVxXeMv4iex2UIc1D7Rcw3CF4q/HPlulcD9uVnsxRvng5Z
6PVcBQH3qGn0zvDDb0QHEcuDed4sNd/4wkYMOchxlp1TfdrbMZdCI+EXwTyvGgbVjd+/
erPyF5JZL/UJx7+gWoXSE7yJkPQrKYiv4LApu0STV4iSOEL8XsTQ4nZiZHSLeeKr0y7w
TUWhjfOCgD/YTZW5PTuFBW+lI03Ny19iGHbQNwKrLLcGwW7TJ2PYBR90vsIfaJtG5RM6
MP1w==
DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed;
d=gmail.com; s=20230601; t=1756408269; x=1757013069; dara=google.com;
h=to:from:subject:date:message-id:auto-submitted:sender:reply-to
:mime-version:from:to:cc:subject:date:message-id:reply-to;
bh=lgr/fFBrye/qM438Us9TAp1/DYWNuYxn2NUL4vzX/SU=;
b=HaiL0lRLeUjb1Rw8g5U5npEElUjhuKY2dPzOaldvum7ZqfY26X35u8SQTxCXWcSsGp
RKrlHykB6fjjPSjSGBB+uKe98anrorlvgkhUluES0LzmAZ6STVlPUfPHb/RreJQ7Ol1r
N7oNIEg5EnGia1g6rWliSMHY7Fb4sQzMaS2P+qhtq0OFzB6F57atJAwTUWaspDHycfdh
S8ji+q7DEiLq1LfXIxj+WwenT/iRFIJsfmvXsHgQiKMoYGdENfAGZPdo7W0sTEK3TkWz
xFOny/4bQmx/49F4C1HnLsHoBi0j6sezIQsc+U83vvChFXXrELQrK5PiJL+UOCLZo48R
RJDQ==
X-Google-Smtp-Source: AGHT+IG3ta6ofCYBa0SfJ7K3lq1EjsCnjr+BZDRz/SVLQfyo54CcUFgE5iTTB5E+h//QXT9iTojhKpMp6QZ4QB+5HAcs
MIME-Version: 1.0
X-Received: by 2002:a05:6602:1544:b0:887:6a2:6054 with SMTP id
ca18e2360f4ac-88706a263famr584022039f.9.1756408269509; Thu, 28 Aug 2025
12:11:09 -0700 (PDT)
Reply-To: tconvertino@gmail.com
Sender: Google Calendar <calendar-notification@google.com>
Auto-Submitted: auto-generated
Message-ID: <calendar-8ecdd8ef-29ed-4f61-857d-1215ab585aba@google.com>
Date: Thu, 28 Aug 2025 19:11:09 +0000
Subject: New event: Dentist appt @ Tue Sep 23, 2025 3pm - 4pm (PDT) (tconvertino@gmail.com)
From: tconvertino@gmail.com
To: couchmoney@gmail.com
Content-Type: multipart/alternative; boundary="000000000000fc1bff063d71aa4b"
X-Spamd-Result: default: False [-0.80 / 15.00];
ARC_ALLOW(-1.00)[google.com:s=arc-20240605:i=2];
URI_COUNT_ODD(1.00)[1];
DMARC_POLICY_ALLOW(-0.50)[gmail.com,none];
R_DKIM_ALLOW(-0.20)[google.com:s=20230601,gmail.com:s=20230601];
R_SPF_ALLOW(-0.20)[+ip6:2a00:1450:4000::/36];
MIME_BASE64_TEXT(0.10)[];
MANY_INVISIBLE_PARTS(0.10)[2];
MIME_GOOD(-0.10)[multipart/alternative,text/plain];
FREEMAIL_TO(0.00)[gmail.com];
RCVD_COUNT_THREE(0.00)[3];
FORGED_SENDER(0.00)[tconvertino@gmail.com,couchmoney@gmail.com];
FROM_NEQ_ENVFROM(0.00)[tconvertino@gmail.com,couchmoney@gmail.com];
MIME_TRACE(0.00)[0:+,1:+,2:~];
FREEMAIL_ENVFROM(0.00)[gmail.com];
RCPT_COUNT_ONE(0.00)[1];
FREEMAIL_REPLYTO(0.00)[gmail.com];
FREEMAIL_FROM(0.00)[gmail.com];
URIBL_BLOCKED(0.00)[mail-lj1-x230.google.com:rdns,mail-lj1-x230.google.com:helo];
TAGGED_FROM(0.00)[caf_=gmail=xinutv];
HAS_REPLYTO(0.00)[tconvertino@gmail.com];
NEURAL_HAM(-0.00)[-0.995];
FWD_GOOGLE(0.00)[couchmoney@gmail.com];
TO_DN_NONE(0.00)[];
FORGED_SENDER_FORWARDING(0.00)[];
RCVD_TLS_LAST(0.00)[];
TO_DOM_EQ_FROM_DOM(0.00)[];
FROM_NO_DN(0.00)[];
ASN(0.00)[asn:15169, ipnet:2a00:1450::/32, country:US];
DKIM_TRACE(0.00)[google.com:+,gmail.com:+];
MISSING_XM_UA(0.00)[];
REPLYTO_EQ_FROM(0.00)[]
X-Rspamd-Server: phx
X-Rspamd-Action: no action
X-Rspamd-Queue-Id: B4E848B007
X-TUID: eMNiZ49uiDPB
--000000000000fc1bff063d71aa4b
Content-Type: text/plain; charset="UTF-8"; format=flowed; delsp=yes
Content-Transfer-Encoding: base64
RGVudGlzdCBhcHB0DQpUdWVzZGF5IFNlcCAyMywgMjAyNSDii4UgM3BtIOKAkyA0cG0NClBhY2lm
aWMgVGltZSAtIExvcyBBbmdlbGVzDQoNCg0KDQpPcmdhbml6ZXINCnRjb252ZXJ0aW5vQGdtYWls
LmNvbQ0KdGNvbnZlcnRpbm9AZ21haWwuY29tDQoNCn5+Ly9+fg0KSW52aXRhdGlvbiBmcm9tIEdv
b2dsZSBDYWxlbmRhcjogaHR0cHM6Ly9jYWxlbmRhci5nb29nbGUuY29tL2NhbGVuZGFyLw0KDQpZ
b3UgYXJlIHJlY2VpdmluZyB0aGlzIGVtYWlsIGJlY2F1c2UgeW91IGFyZSBzdWJzY3JpYmVkIHRv
IGNhbGVuZGFyICANCm5vdGlmaWNhdGlvbnMuIFRvIHN0b3AgcmVjZWl2aW5nIHRoZXNlIGVtYWls
cywgZ28gdG8gIA0KaHR0cHM6Ly9jYWxlbmRhci5nb29nbGUuY29tL2NhbGVuZGFyL3Ivc2V0dGlu
Z3MsIHNlbGVjdCB0aGlzIGNhbGVuZGFyLCBhbmQgIA0KY2hhbmdlICJPdGhlciBub3RpZmljYXRp
b25zIi4NCg0KRm9yd2FyZGluZyB0aGlzIGludml0YXRpb24gY291bGQgYWxsb3cgYW55IHJlY2lw
aWVudCB0byBzZW5kIGEgcmVzcG9uc2UgdG8gIA0KdGhlIG9yZ2FuaXplciwgYmUgYWRkZWQgdG8g
dGhlIGd1ZXN0IGxpc3QsIGludml0ZSBvdGhlcnMgcmVnYXJkbGVzcyBvZiAgDQp0aGVpciBvd24g
aW52aXRhdGlvbiBzdGF0dXMsIG9yIG1vZGlmeSB5b3VyIFJTVlAuDQoNCkxlYXJuIG1vcmUgaHR0
cHM6Ly9zdXBwb3J0Lmdvb2dsZS5jb20vY2FsZW5kYXIvYW5zd2VyLzM3MTM1I2ZvcndhcmRpbmcN
Cg==
--000000000000fc1bff063d71aa4b
Content-Type: text/html; charset="UTF-8"
Content-Transfer-Encoding: quoted-printable
<!doctype html><html xmlns=3D"http://www.w3.org/1999/xhtml" xmlns:v=3D"urn:="...truncated for brevity...


@@ -0,0 +1,169 @@
Return-Path: <couchmoney+caf_=gmail=xinu.tv@gmail.com>
Delivered-To: bill@xinu.tv
Received: from phx.xinu.tv [74.207.253.222]
by nixos-01.h.xinu.tv with IMAP (fetchmail-6.4.39)
for <wathiede@localhost> (single-drop); Mon, 02 Jun 2025 07:06:34 -0700 (PDT)
Received: from phx.xinu.tv
by phx.xinu.tv with LMTP
id qDo+FuqvPWh51xIAJR8clQ
(envelope-from <couchmoney+caf_=gmail=xinu.tv@gmail.com>)
for <bill@xinu.tv>; Mon, 02 Jun 2025 07:06:34 -0700
X-Original-To: gmail@xinu.tv
Received-SPF: Pass (mailfrom) identity=mailfrom; client-ip=2a00:1450:4864:20::130; helo=mail-lf1-x130.google.com; envelope-from=couchmoney+caf_=gmail=xinu.tv@gmail.com; receiver=xinu.tv
Authentication-Results: phx.xinu.tv;
dkim=pass (2048-bit key; unprotected) header.d=google.com header.i=@google.com header.a=rsa-sha256 header.s=20230601 header.b=zT2yUtVH;
dkim=pass (2048-bit key; unprotected) header.d=gmail.com header.i=@gmail.com header.a=rsa-sha256 header.s=20230601 header.b=nmJW8N67
Received: from mail-lf1-x130.google.com (mail-lf1-x130.google.com [IPv6:2a00:1450:4864:20::130])
by phx.xinu.tv (Postfix) with ESMTPS id 912AC80034
for <gmail@xinu.tv>; Mon, 02 Jun 2025 07:06:32 -0700 (PDT)
Received: by mail-lf1-x130.google.com with SMTP id 2adb3069b0e04-54e7967cf67so5267078e87.0
for <gmail@xinu.tv>; Mon, 02 Jun 2025 07:06:32 -0700 (PDT)
ARC-Seal: i=2; a=rsa-sha256; t=1748873190; cv=pass;
d=google.com; s=arc-20240605;
b=W3s0wT+CV1W21AldY9lfxPlKRbc7XMoorEnilNq5iGjlw18vDM6eFPb+btqaGAPOPe
CMyGeinsFPuql+S7u6HgjZcf9ZFH71sKoFoQytm30hAXB76GO06qi1jRW6o0miuGt/j/
bb8qWAiAsGr34mHIbE5fBdkNOGcqW85oI78GolLqpROgn/42boEYxiGAQjybPtO4L84J
wP2RBkHiQQGXUjL6b02tozCji1w2XdfYqtW8RteUs1pqYdXl4GUilMLt5C0d2bhSGksS
3tMTFjuycbaj+F6QFCkQfEsHx/I7GjuD4mToLcYpzrNnmZZUidAoKuh+uin0cEVvnQ1j
V8aA==
ARC-Message-Signature: i=2; a=rsa-sha256; c=relaxed/relaxed; d=google.com; s=arc-20240605;
h=to:from:subject:date:message-id:auto-submitted:sender:reply-to
:mime-version:dkim-signature:dkim-signature:delivered-to;
bh=dgRmOj3aABlB3SNw+xxlI8L9ugJFZ1WMJrtLw/W8tnA=;
fh=5zy5Gi9ngAea7dC9ZKKPh/BZlFmotJq74g9KHrEIwaE=;
b=QTAjqit0gYnuGa1lbO9RUXOVpyutliNo+tG6irWFsjGhnvMkis2KdLb6saYPnLCG7F
rSRXvw0HwuaJfXAV3XvIT0pxTg3PXYnc8kt/F8OtG+LiakJbMV1soj8OJ+5lZPKFmvna
i2T5mJjEknZsc9qWYmaAEVqIg71jhPH5CjJyehNhsIJ1/O9CH4VF8L0yv9KUMAA4tzog
LfI+SpOE2z/wYuMDxi2Ld3FgaVCQgkMM2Tlys8P0DjCaewWeaZFmZKIEEZUbKWbrivTa
RSO+Us+9yrt8hDdJuvtf9eXsGvuZtdj/2APRts/0cd7SFAQqRd0DnhGIHoXR74YVHaqi
U7IQ==;
darn=xinu.tv
ARC-Authentication-Results: i=2; mx.google.com;
dkim=pass header.i=@google.com header.s=20230601 header.b=zT2yUtVH;
dkim=pass header.i=@gmail.com header.s=20230601 header.b=nmJW8N67;
spf=pass (google.com: domain of tconvertino@gmail.com designates 209.85.220.73 as permitted sender) smtp.mailfrom=tconvertino@gmail.com;
dmarc=pass (p=NONE sp=QUARANTINE dis=NONE) header.from=gmail.com;
dara=pass header.i=@gmail.com
X-Google-DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed;
d=1e100.net; s=20230601; t=1748873190; x=1749477990;
h=to:from:subject:date:message-id:auto-submitted:sender:reply-to
:mime-version:dkim-signature:dkim-signature:delivered-to
:x-forwarded-for:x-forwarded-to:x-gm-message-state:from:to:cc
:subject:date:message-id:reply-to;
bh=dgRmOj3aABlB3SNw+xxlI8L9ugJFZ1WMJrtLw/W8tnA=;
b=dBjp6JdmFUj0jKPDo9r2/xvfVSvxKaF15UYwYU7itdM18qpCnrgQdHMP2ST7EQBxou
58yZfVjrx84gg9phedpVSg4SaBaPIhXsLuUeVQZtPd7J3WYiH4+OGcecjV+cD0dG0TUi
o/FbZULNl3REysvoAj+AwUL/ny2FnNU4PIhkeSq+d6iNztkexIKLS8qWqHosenPlVX+E
Z7OGQZpK6m1LB5UbCsaODQq5wbNIxlOxqTP1rCHe/hHk53ljiNegzaOS31mVvp1n8/g1
pWIZltyZORs0zi6U9+mNd9ZbaeQjHqBrcb2bsTxCD+u0DBuF2RjLguS/feaB25TG8LAg
szYg==
X-Forwarded-Encrypted: i=2; AJvYcCXfGRAIDqrPsT1vzTMSiuMrlTj/DbRrr+8w7X+iLRH2XK/n8MZhV3UaT0Zia6c6jMrf3s3eHA==@xinu.tv
X-Gm-Message-State: AOJu0YxOQEmNiUg4NKf4NM1BgQMqTJaFM6txPnL6u74ff1dZvoSgTC4d
TtJJqfdHsajxloSGDsSPqIQ/M/Se/sfymEExFQxDXYA/XasA6+sdye/Ihl9QekGJK9jet1VtQ3r
dcg89xnFcxezg3ji6xH8jnSULlp350K9K7LR0LfTQqg6e/BEKEF8XDaNgmJC+RQ==
X-Received: by 2002:a05:6512:2246:b0:553:35bb:f7b7 with SMTP id 2adb3069b0e04-55342f92776mr2472199e87.32.1748873190333;
Mon, 02 Jun 2025 07:06:30 -0700 (PDT)
X-Forwarded-To: gmail@xinu.tv
X-Forwarded-For: couchmoney@gmail.com gmail@xinu.tv
Delivered-To: couchmoney@gmail.com
Received: by 2002:ab3:7457:0:b0:2b1:14e:dc2b with SMTP id g23csp2818972lti;
Mon, 2 Jun 2025 07:06:29 -0700 (PDT)
X-Received: by 2002:a05:6602:6a8b:b0:86c:f898:74b8 with SMTP id ca18e2360f4ac-86d0521552emr1082401939f.10.1748873188734;
Mon, 02 Jun 2025 07:06:28 -0700 (PDT)
ARC-Seal: i=1; a=rsa-sha256; t=1748873188; cv=none;
d=google.com; s=arc-20240605;
b=d2PNXrTE3VYjml3FmbC5rBW6XnsyuyVO3lPyM6VoVKFcvZ7a8tDRB+sh1ibo0D5Nvg
3i/Qon0RV401WFb9NQf5P048wpj19G8bOGPZUKMioBZcSxkr1RwH/GW6GBvGS+d+iqbW
43KWc6Px7RGOEeYfp8D88CuJ/5kMcsLMfDV1FRHo6T+chVY6c9fQkHjRreSGQcFXglt5
yaCpFKkAODO7rSHl2OW2kQ6eGgR0tUjb95+jdZXoU0GS3119CBYK9n9UhNaeXHIk/Zyy
f08r4Ce/m3Y6ISr4ovXxDeYNpeeUN1HT3XVyCVQJHjfWrHypKTiOt4q6yBhCgOgZTXJq
pL5A==;
ARC-Message-Signature: i=1; a=rsa-sha256; c=relaxed/relaxed; d=google.com; s=arc-20240605;
h=to:from:subject:date:message-id:auto-submitted:sender:reply-to
:mime-version:dkim-signature:dkim-signature;
bh=dgRmOj3aABlB3SNw+xxlI8L9ugJFZ1WMJrtLw/W8tnA=;
fh=mbzrMIWIgWMC0ni1xEx+ViW4J0RLAdLdPT2cX81nTlk=;
b=YiMakYeE05UctWy9sW90/a3l1Hk1pAPv0+fpk5vmWrADcMwwI8cHVqBp+Nxds5psWa
a/zrw9UlxV4HgjLUP+ella/pK8XxK+sitKg0IhPOntwKbq1KfTNheufh4HtWj5yWedHE
sO/dVs6z/EW/gWrfBK/3JMgsnz3HrHmaoJ6caCaGI6t5jHxEXI+eJc5zILY+n0MdivkX
tJOo0L1s/k6MAdyLr4/IVqpxdhXbUPq44twCBNheHd8T5w1DC9ZXcr54X79fW8Vzbm8/
A++H3gnZRGtOayRySYQl04LFLk4YsisdhsKuaJV+WKYCW58wQqJT04mrVkx+m96qr1q0
BQtw==;
dara=google.com
ARC-Authentication-Results: i=1; mx.google.com;
dkim=pass header.i=@google.com header.s=20230601 header.b=zT2yUtVH;
dkim=pass header.i=@gmail.com header.s=20230601 header.b=nmJW8N67;
spf=pass (google.com: domain of tconvertino@gmail.com designates 209.85.220.73 as permitted sender) smtp.mailfrom=tconvertino@gmail.com;
dmarc=pass (p=NONE sp=QUARANTINE dis=NONE) header.from=gmail.com;
dara=pass header.i=@gmail.com
Received: from mail-sor-f73.google.com (mail-sor-f73.google.com. [209.85.220.73])
by mx.google.com with SMTPS id ca18e2360f4ac-86d0213d491sor465078439f.8.2025.06.02.07.06.28
for <couchmoney@gmail.com>
(Google Transport Security);
Mon, 02 Jun 2025 07:06:28 -0700 (PDT)
Received-SPF: pass (google.com: domain of tconvertino@gmail.com designates 209.85.220.73 as permitted sender) client-ip=209.85.220.73;
Authentication-Results: mx.google.com;
dkim=pass header.i=@google.com header.s=20230601 header.b=zT2yUtVH;
dkim=pass header.i=@gmail.com header.s=20230601 header.b=nmJW8N67;
spf=pass (google.com: domain of tconvertino@gmail.com designates 209.85.220.73 as permitted sender) smtp.mailfrom=tconvertino@gmail.com;
dmarc=pass (p=NONE sp=QUARANTINE dis=NONE) header.from=gmail.com;
dara=pass header.i=@gmail.com
DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed;
d=google.com; s=20230601; t=1748873188; x=1749477988; dara=google.com;
h=to:from:subject:date:message-id:auto-submitted:sender:reply-to
:mime-version:from:to:cc:subject:date:message-id:reply-to;
bh=dgRmOj3aABlB3SNw+xxlI8L9ugJFZ1WMJrtLw/W8tnA=;
b=zT2yUtVHhNy5fFiy6YKzfYCQPlCnufAEoWmbvjvj7mFNYUlLJHZ5FUeNnDs06Z1icR
bSVtejKixrz4hjFh9KeKvV9EQNGU7UFgySwqdy6szm+sHZQj+iJAXy85A1QaL6+0Swup
2y8QsjVJ96uugM0SaAYZqe+lmLBk6zFWqkg0U37vgwOupAcNsNBd7tos7cxO5eK6Aops
FJjr9JAD+ddX03ngH9zfnvlNV/+qbmiP6Hs8OmaJtZof2GLucpHgqUpIdolCh7F72v4p
DibO4RShI/IQCw9ejZxhRPBPWQwIdOYLjD/sDunX63M4NCS/63jZfhwqsAVgtmN/cUGq
spHQ==
DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed;
d=gmail.com; s=20230601; t=1748873188; x=1749477988; dara=google.com;
h=to:from:subject:date:message-id:auto-submitted:sender:reply-to
:mime-version:from:to:cc:subject:date:message-id:reply-to;
bh=dgRmOj3aABlB3SNw+xxlI8L9ugJFZ1WMJrtLw/W8tnA=;
b=nmJW8N67IylgMNprzzf/IC7V2r7xeY0+8Bl0KcAak6Xly+IhVv3nyccvgdKsp+8Ccd
NcikfVOtCsE3gTqviReUbTAKy7PyClAbBTEHC0Ne71549BN+v8zX64RpGDFJGX5pJMG5
r0Ak88nxzjWkvDLhlnHmWdt/NggdQEI6T7oP4VZo0f0/Ym7g1WJhSItfdIhSRDNzK3ed
WPRXUIb1sW3+N0My4Os6L4IA9kdRk5z0qpQxtsIL9N0dzv4q18q6eH3KfTzVPr59PsYT
uSgkWoLQZdfA70MMlIRU5CnGbVDRH4TO/ib433vIblOmtLTkQ4EaOTzncbs0tovVes4z
evsQ==
X-Google-Smtp-Source: AGHT+IETNpLvkLm7t8VAdDcEcVtxFCttPh/uVZhoQCRlhUNlx9bmg67olJiD9EOND8g0z43NnM8iK4FxezZondExIawx
MIME-Version: 1.0
X-Received: by 2002:a05:6602:4183:b0:864:4a1b:dfc5 with SMTP id
ca18e2360f4ac-86d052154eamr1431889339f.9.1748873188195; Mon, 02 Jun 2025
07:06:28 -0700 (PDT)
Reply-To: tconvertino@gmail.com
Sender: Google Calendar <calendar-notification@google.com>
Auto-Submitted: auto-generated
Message-ID: <calendar-093be1c9-5d94-4994-8bc5-7daa1cfae47b@google.com>
Date: Mon, 02 Jun 2025 14:06:28 +0000
Subject: New event: Tamara and Scout in Alaska @ Tue Jun 24 - Mon Jun 30, 2025 (tconvertino@gmail.com)
From: tconvertino@gmail.com
To: couchmoney@gmail.com
Content-Type: multipart/alternative; boundary="00000000000023c70606369745e9"
--00000000000023c70606369745e9
Content-Type: text/plain; charset="UTF-8"; format=flowed; delsp=yes
Content-Transfer-Encoding: base64
VGFtYXJhIGFuZCBTY291dCBpbiBBbGFza2ENClR1ZXNkYXkgSnVuIDI0IOKAkyBNb25kYXkgSnVu
IDMwLCAyMDI1DQoNCg0KDQpPcmdhbml6ZXINCnRjb252ZXJ0aW5vQGdtYWlsLmNvbQ0KdGNvbnZl
cnRpbm9AZ21haWwuY29tDQoNCn5+Ly9+fg0KSW52aXRhdGlvbiBmcm9tIEdvb2dsZSBDYWxlbmRh
cjogaHR0cHM6Ly9jYWxlbmRhci5nb29nbGUuY29tL2NhbGVuZGFyLw0KDQpZb3UgYXJlIHJlY2Vp
dmluZyB0aGlzIGVtYWlsIGJlY2F1c2UgeW91IGFyZSBzdWJzY3JpYmVkIHRvIGNhbGVuZGFyICAN
Cm5vdGlmaWNhdGlvbnMuIFRvIHN0b3AgcmVjZWl2aW5nIHRoZXNlIGVtYWlscywgZ28gdG8gIA0K
aHR0cHM6Ly9jYWxlbmRhci5nb29nbGUuY29tL2NhbGVuZGFyL3Ivc2V0dGluZ3MsIHNlbGVjdCB0
aGlzIGNhbGVuZGFyLCBhbmQgIA0KY2hhbmdlICJPdGhlciBub3RpZmljYXRpb25zIi4NCg0KRm9y
d2FyZGluZyB0aGlzIGludml0YXRpb24gY291bGQgYWxsb3cgYW55IHJlY2lwaWVudCB0byBzZW5k
IGEgcmVzcG9uc2UgdG8gIA0KdGhlIG9yZ2FuaXplciwgYmUgYWRkZWQgdG8gdGhlIGd1ZXN0IGxp
c3QsIGludml0ZSBvdGhlcnMgcmVnYXJkbGVzcyBvZiAgDQp0aGVpciBvd24gaW52aXRhdGlvbiBz
dGF0dXMsIG9yIG1vZGlmeSB5b3VyIFJTVlAuDQoNCkxlYXJuIG1vcmUgaHR0cHM6Ly9zdXBwb3J0
Lmdvb2dsZS5jb20vY2FsZW5kYXIvYW5zd2VyLzM3MTM1I2ZvcndhcmRpbmcNCg==
--00000000000023c70606369745e9
Content-Type: text/html; charset="UTF-8"
Content-Transfer-Encoding: quoted-printable
<!doctype html><html xmlns=3D"http://www.w3.org/1999/xhtml" xmlns:v=3D"urn:="...truncated for brevity...

server/testdata/ical-example-1.ics vendored Normal file

@@ -0,0 +1,57 @@
BEGIN:VCALENDAR
METHOD:REQUEST
PRODID:Microsoft Exchange Server 2010
VERSION:2.0
BEGIN:VTIMEZONE
TZID:Pacific Standard Time
BEGIN:STANDARD
DTSTART:16010101T020000
TZOFFSETFROM:-0700
TZOFFSETTO:-0800
RRULE:FREQ=YEARLY;INTERVAL=1;BYDAY=1SU;BYMONTH=11
END:STANDARD
BEGIN:DAYLIGHT
DTSTART:16010101T020000
TZOFFSETFROM:-0800
TZOFFSETTO:-0700
RRULE:FREQ=YEARLY;INTERVAL=1;BYDAY=2SU;BYMONTH=3
END:DAYLIGHT
END:VTIMEZONE
BEGIN:VEVENT
ORGANIZER;CN=Bill Thiede:mailto:wthiede@nvidia.com
ATTENDEE;ROLE=REQ-PARTICIPANT;PARTSTAT=NEEDS-ACTION;RSVP=TRUE;CN=Bill:mailt
o:couchmoney@gmail.com
DESCRIPTION;LANGUAGE=en-US:\n
UID:040000008200E00074C5B7101A82E00800000000A1458AEA8E4DDB01000000000000000
010000000988BC323BE65A8458B718B5EF8FE8152
SUMMARY;LANGUAGE=en-US:dentist night guard
DTSTART;TZID=Pacific Standard Time:20250108T080000
DTEND;TZID=Pacific Standard Time:20250108T090000
CLASS:PUBLIC
PRIORITY:5
DTSTAMP:20241213T184408Z
TRANSP:OPAQUE
STATUS:CONFIRMED
SEQUENCE:0
LOCATION;LANGUAGE=en-US:
X-MICROSOFT-CDO-APPT-SEQUENCE:0
X-MICROSOFT-CDO-OWNERAPPTID:2123132523
X-MICROSOFT-CDO-BUSYSTATUS:TENTATIVE
X-MICROSOFT-CDO-INTENDEDSTATUS:BUSY
X-MICROSOFT-CDO-ALLDAYEVENT:FALSE
X-MICROSOFT-CDO-IMPORTANCE:1
X-MICROSOFT-CDO-INSTTYPE:0
X-MICROSOFT-ONLINEMEETINGEXTERNALLINK:
X-MICROSOFT-ONLINEMEETINGCONFLINK:
X-MICROSOFT-DONOTFORWARDMEETING:FALSE
X-MICROSOFT-DISALLOW-COUNTER:FALSE
X-MICROSOFT-REQUESTEDATTENDANCEMODE:DEFAULT
X-MICROSOFT-ISRESPONSEREQUESTED:TRUE
X-MICROSOFT-LOCATIONS:[]
BEGIN:VALARM
DESCRIPTION:REMINDER
TRIGGER;RELATED=START:-PT5M
ACTION:DISPLAY
END:VALARM
END:VEVENT
END:VCALENDAR

server/testdata/ical-example-2.ics vendored Normal file

@@ -0,0 +1,30 @@
BEGIN:VCALENDAR
PRODID:-//Google Inc//Google Calendar 70.9054//EN
VERSION:2.0
CALSCALE:GREGORIAN
METHOD:REPLY
X-GOOGLE-CALID:g66m0feuqsao8l1c767pvvcg4k@group.calendar.google.com
BEGIN:VEVENT
DTSTART:20250813T010000Z
DTEND:20250813T030000Z
DTSTAMP:20250801T022550Z
ORGANIZER;CN=Family:mailto:g66m0feuqsao8l1c767pvvcg4k@group.calendar.google
.com
UID:6os3ap346th6ab9nckp30b9kc8sm2bb160q3gb9l6lgm6or160rjee1mco@google.com
ATTENDEE;CUTYPE=INDIVIDUAL;ROLE=REQ-PARTICIPANT;PARTSTAT=ACCEPTED;CN=superm
atute@gmail.com;X-NUM-GUESTS=0:mailto:supermatute@gmail.com
X-GOOGLE-CONFERENCE:https://meet.google.com/dcu-hykx-vym
CREATED:20250801T015712Z
DESCRIPTION:-::~:~::~:~:~:~:~:~:~:~:~:~:~:~:~:~:~:~:~:~:~:~:~:~:~:~:~:~:~:~
:~:~:~:~:~:~:~:~::~:~::-\nJoin with Google Meet: https://meet.google.com/dc
u-hykx-vym\n\nLearn more about Meet at: https://support.google.com/a/users/
answer/9282720\n\nPlease do not edit this section.\n-::~:~::~:~:~:~:~:~:~:~
:~:~:~:~:~:~:~:~:~:~:~:~:~:~:~:~:~:~:~:~:~:~:~:~:~:~:~:~::~:~::-
LAST-MODIFIED:20250801T022549Z
LOCATION:
SEQUENCE:0
STATUS:CONFIRMED
SUMMARY:[tenative] dinner w/ amatute
TRANSP:OPAQUE
END:VEVENT
END:VCALENDAR

server/testdata/ical-multiday.ics vendored Normal file

@@ -0,0 +1,9 @@
BEGIN:VCALENDAR
VERSION:2.0
BEGIN:VEVENT
SUMMARY:Multi-day Event
DTSTART;VALUE=DATE:20250828
DTEND;VALUE=DATE:20250831
DESCRIPTION:This event spans multiple days.
END:VEVENT
END:VCALENDAR

server/testdata/ical-straddle-real.ics vendored Normal file

@@ -0,0 +1,36 @@
BEGIN:VCALENDAR
PRODID:-//Google Inc//Google Calendar 70.9054//EN
VERSION:2.0
CALSCALE:GREGORIAN
METHOD:REQUEST
BEGIN:VEVENT
DTSTART;VALUE=DATE:20250830
DTEND;VALUE=DATE:20250902
DTSTAMP:20250819T183713Z
ORGANIZER;CN=Bill Thiede:mailto:couchmoney@gmail.com
UID:37kplskaimjnhdnt8r5ui9pv7f@google.com
ATTENDEE;CUTYPE=INDIVIDUAL;ROLE=REQ-PARTICIPANT;PARTSTAT=NEEDS-ACTION;RSVP=
TRUE;CN=bill@xinu.tv;X-NUM-GUESTS=0:mailto:bill@xinu.tv
ATTENDEE;CUTYPE=INDIVIDUAL;ROLE=REQ-PARTICIPANT;PARTSTAT=ACCEPTED;RSVP=TRUE
;CN=Bill Thiede;X-NUM-GUESTS=0:mailto:couchmoney@gmail.com
X-MICROSOFT-CDO-OWNERAPPTID:1427505964
CREATED:20250819T183709Z
DESCRIPTION:
LAST-MODIFIED:20250819T183709Z
LOCATION:
SEQUENCE:0
STATUS:CONFIRMED
SUMMARY:Test Straddle Month
TRANSP:TRANSPARENT
BEGIN:VALARM
ACTION:DISPLAY
DESCRIPTION:This is an event reminder
TRIGGER:-P0DT0H30M0S
END:VALARM
BEGIN:VALARM
ACTION:DISPLAY
DESCRIPTION:This is an event reminder
TRIGGER:-P0DT7H30M0S
END:VALARM
END:VEVENT
END:VCALENDAR

server/testdata/ical-straddle.ics vendored Normal file

@@ -0,0 +1,13 @@
BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Test Recurring Event//EN
BEGIN:VEVENT
UID:recurring-test-1@example.com
DTSTART;VALUE=DATE:20250804
DTEND;VALUE=DATE:20250805
RRULE:FREQ=WEEKLY;BYDAY=MO,WE,FR;UNTIL=20250825T000000Z
SUMMARY:Test Recurring Event (Mon, Wed, Fri)
DESCRIPTION:This event recurs every Monday, Wednesday, and Friday in August 2025.
END:VEVENT
END:VCALENDAR


@@ -1,10 +1,20 @@
[package]
name = "shared"
version = "0.1.0"
edition = "2021"
name = "letterbox-shared"
description = "Shared module for letterbox"
authors.workspace = true
edition.workspace = true
license.workspace = true
publish.workspace = true
repository.workspace = true
version.workspace = true
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
notmuch = { path = "../notmuch" }
serde = { version = "1.0.147", features = ["derive"] }
build-info = "0.0.41"
letterbox-notmuch = { path = "../notmuch", version = "0.17.40", registry = "xinu" }
regex = "1.11.1"
serde = { version = "1.0.219", features = ["derive"] }
sqlx = "0.8.5"
strum_macros = "0.27.1"
tracing = "0.1.41"


@@ -1,5 +1,14 @@
use notmuch::SearchSummary;
use std::{
convert::Infallible,
hash::{DefaultHasher, Hash, Hasher},
str::FromStr,
};
use build_info::{BuildInfo, VersionControl};
use letterbox_notmuch::SearchSummary;
use regex::{RegexBuilder, RegexSetBuilder};
use serde::{Deserialize, Serialize};
use tracing::debug;
#[derive(Serialize, Deserialize, Debug)]
pub struct SearchResult {
@@ -9,3 +18,251 @@ pub struct SearchResult {
pub results_per_page: usize,
pub total: usize,
}
#[derive(Serialize, Deserialize, Debug, strum_macros::Display)]
pub enum WebsocketMessage {
RefreshMessages,
}
pub mod urls {
pub const MOUNT_POINT: &'static str = "/api";
pub fn view_original(host: Option<&str>, id: &str) -> String {
if let Some(host) = host {
format!("//{host}/api/original/{id}")
} else {
format!("/api/original/{id}")
}
}
pub fn cid_prefix(host: Option<&str>, cid: &str) -> String {
if let Some(host) = host {
format!("//{host}/api/cid/{cid}/")
} else {
format!("/api/cid/{cid}/")
}
}
pub fn download_attachment(host: Option<&str>, id: &str, idx: &str, filename: &str) -> String {
if let Some(host) = host {
format!(
"//{host}/api/download/attachment/{}/{}/{}",
id, idx, filename
)
} else {
format!("/api/download/attachment/{}/{}/{}", id, idx, filename)
}
}
}
pub fn build_version(bi: fn() -> &'static BuildInfo) -> String {
fn commit(git: &Option<VersionControl>) -> String {
let Some(VersionControl::Git(git)) = git else {
return String::new();
};
let mut s = vec!["-".to_string(), git.commit_short_id.clone()];
if let Some(branch) = &git.branch {
s.push(format!(" ({branch})"));
}
s.join("")
}
let bi = bi();
format!("v{}{}", bi.crate_info.version, commit(&bi.version_control)).to_string()
}
pub fn compute_color(data: &str) -> String {
let mut hasher = DefaultHasher::new();
data.hash(&mut hasher);
format!("#{:06x}", hasher.finish() % (1 << 24))
}
#[derive(
Copy, Clone, Debug, Default, PartialEq, Eq, Hash, Ord, PartialOrd, Serialize, Deserialize,
)]
pub enum MatchType {
From,
Sender,
To,
Cc,
Subject,
ListId,
DeliveredTo,
XForwardedTo,
ReplyTo,
XOriginalTo,
XSpam,
Body,
#[default]
Unknown,
}
#[derive(Debug, Default, Serialize, Deserialize)]
pub struct Match {
pub match_type: MatchType,
pub needle: String,
}
#[derive(Debug, Default, Serialize, Deserialize)]
pub struct Rule {
pub stop_on_match: bool,
pub matches: Vec<Match>,
pub tag: String,
}
impl Rule {
pub fn is_match(&self, header_key: &str, header_value: &str) -> bool {
let pats: Vec<_> = self
.matches
.iter()
.filter_map(|m| match m.match_type {
MatchType::To => Some("^(to|cc|bcc|x-original-to)$"),
MatchType::From => Some("^from$"),
MatchType::Sender => Some("^sender$"),
MatchType::Subject => Some("^subject$"),
MatchType::ListId => Some("^list-id$"),
MatchType::XOriginalTo => Some("^x-original-to$"),
MatchType::ReplyTo => Some("^reply-to$"),
MatchType::XSpam => Some("^x-spam$"),
MatchType::Body => None,
c => panic!("TODO handle '{c:?}' match type"),
})
.collect();
let set = RegexSetBuilder::new(&pats)
.case_insensitive(true)
.build()
.expect("failed to compile regex for matches");
let matches: Vec<_> = set.matches(header_key).into_iter().collect();
if !matches.is_empty() {
//info!("matched key '{header_key}' '{header_value}'");
for m_idx in matches {
let needle = regex::escape(&self.matches[m_idx].needle);
let pat = RegexBuilder::new(&needle)
.case_insensitive(true)
.build()
.expect("failed to compile regex for needle");
if pat.is_match(header_value) {
debug!("{header_key} matched {header_value} against {needle}");
return true;
}
}
}
false
}
}
mod matches {
// From https://linux.die.net/man/5/procmailrc
// If the regular expression contains '^TO_' it will be substituted by '(^((Original-)?(Resent-)?(To|Cc|Bcc)|(X-Envelope |Apparently(-Resent)?)-To):(.*[^-a-zA-Z0-9_.])?)'
// If the regular expression contains '^TO' it will be substituted by '(^((Original-)?(Resent-)?(To|Cc|Bcc)|(X-Envelope |Apparently(-Resent)?)-To):(.*[^a-zA-Z])?)', which should catch all destination specifications containing a specific word.
pub const TO: &'static str = "TO";
pub const CC: &'static str = "Cc";
pub const TOCC: &'static str = "(TO|Cc)";
pub const FROM: &'static str = "From";
pub const SENDER: &'static str = "Sender";
pub const SUBJECT: &'static str = "Subject";
pub const DELIVERED_TO: &'static str = "Delivered-To";
pub const X_FORWARDED_TO: &'static str = "X-Forwarded-To";
pub const REPLY_TO: &'static str = "Reply-To";
pub const X_ORIGINAL_TO: &'static str = "X-Original-To";
pub const LIST_ID: &'static str = "List-ID";
pub const X_SPAM: &'static str = "X-Spam";
pub const X_SPAM_FLAG: &'static str = "X-Spam-Flag";
}
impl FromStr for Match {
type Err = Infallible;
fn from_str(s: &str) -> Result<Self, Self::Err> {
// Examples:
// "* 1^0 ^TOsonyrewards.com@xinu.tv"
// "* ^TOsonyrewards.com@xinu.tv"
let mut it = s.split_whitespace().skip(1);
let mut needle = it.next().unwrap();
if needle == "1^0" {
needle = it.next().unwrap();
}
let mut needle = vec![needle];
needle.extend(it);
let needle = needle.join(" ");
let first = needle.chars().nth(0).unwrap_or(' ');
use matches::*;
if first == '^' {
let needle = &needle[1..];
if needle.starts_with(TO) {
return Ok(Match {
match_type: MatchType::To,
needle: cleanup_match(TO, needle),
});
} else if needle.starts_with(FROM) {
return Ok(Match {
match_type: MatchType::From,
needle: cleanup_match(FROM, needle),
});
} else if needle.starts_with(CC) {
return Ok(Match {
match_type: MatchType::Cc,
needle: cleanup_match(CC, needle),
});
} else if needle.starts_with(TOCC) {
return Ok(Match {
match_type: MatchType::To,
needle: cleanup_match(TOCC, needle),
});
} else if needle.starts_with(SENDER) {
return Ok(Match {
match_type: MatchType::Sender,
needle: cleanup_match(SENDER, needle),
});
} else if needle.starts_with(SUBJECT) {
return Ok(Match {
match_type: MatchType::Subject,
needle: cleanup_match(SUBJECT, needle),
});
} else if needle.starts_with(X_ORIGINAL_TO) {
return Ok(Match {
match_type: MatchType::XOriginalTo,
needle: cleanup_match(X_ORIGINAL_TO, needle),
});
} else if needle.starts_with(LIST_ID) {
return Ok(Match {
match_type: MatchType::ListId,
needle: cleanup_match(LIST_ID, needle),
});
} else if needle.starts_with(REPLY_TO) {
return Ok(Match {
match_type: MatchType::ReplyTo,
needle: cleanup_match(REPLY_TO, needle),
});
} else if needle.starts_with(X_SPAM_FLAG) {
return Ok(Match {
match_type: MatchType::XSpam,
needle: '*'.to_string(),
});
} else if needle.starts_with(X_SPAM) {
return Ok(Match {
match_type: MatchType::XSpam,
needle: '*'.to_string(),
});
} else if needle.starts_with(DELIVERED_TO) {
return Ok(Match {
match_type: MatchType::DeliveredTo,
needle: cleanup_match(DELIVERED_TO, needle),
});
} else if needle.starts_with(X_FORWARDED_TO) {
return Ok(Match {
match_type: MatchType::XForwardedTo,
needle: cleanup_match(X_FORWARDED_TO, needle),
});
} else {
unreachable!("needle: '{needle}'")
}
} else {
return Ok(Match {
match_type: MatchType::Body,
needle: cleanup_match("", &needle),
});
}
}
}
fn unescape(s: &str) -> String {
s.replace('\\', "")
}
pub fn cleanup_match(prefix: &str, s: &str) -> String {
unescape(&s[prefix.len()..]).replace(".*", "")
}

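The `FromStr for Match` impl in the diff above parses procmail-style condition lines (a leading `*`, an optional `1^0` weight token, then an optionally `^`-anchored header pattern). A standalone sketch of that parsing flow, using a hypothetical `parse_rule` helper that covers only the `TO`, `From`, and body cases rather than the full `MatchType` set:

```rust
/// Minimal sketch of procmail-style rule parsing: returns a (match type,
/// cleaned needle) pair. `cleanup` mirrors the diff's `unescape` +
/// `.replace(".*", "")` steps.
fn parse_rule(s: &str) -> (&'static str, String) {
    // Skip the leading "*" token, then an optional "1^0" weight token.
    let mut it = s.split_whitespace().skip(1);
    let mut needle = it.next().unwrap();
    if needle == "1^0" {
        needle = it.next().unwrap();
    }
    // Strip backslash escapes and ".*" wildcards from the needle.
    let cleanup = |s: &str| s.replace('\\', "").replace(".*", "");
    if let Some(rest) = needle.strip_prefix('^') {
        // Header-anchored patterns: dispatch on the recognized prefix.
        if let Some(n) = rest.strip_prefix("TO") {
            return ("To", cleanup(n));
        }
        if let Some(n) = rest.strip_prefix("From") {
            return ("From", cleanup(n));
        }
    }
    // No "^" anchor: treat the pattern as a body match.
    ("Body", cleanup(needle))
}

fn main() {
    assert_eq!(
        parse_rule("* 1^0 ^TOsonyrewards.com@xinu.tv"),
        ("To", "sonyrewards.com@xinu.tv".to_string())
    );
    assert_eq!(
        parse_rule(r"* ^Fromamazon\.com"),
        ("From", "amazon.com".to_string())
    );
    println!("ok");
}
```

As in the diff, prefix order matters when dispatching (e.g. `X-Spam-Flag` must be tested before `X-Spam`), since `strip_prefix`/`starts_with` checks are greedy only in the order they are written.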

@@ -1,39 +1,58 @@
[package]
version = "0.1.0"
name = "letterbox"
repository = "https://github.com/seed-rs/seed-quickstart"
authors = ["Bill Thiede <git@xinu.tv>"]
description = "App Description"
categories = ["category"]
license = "MIT"
readme = "./README.md"
edition = "2018"
name = "letterbox-web"
description = "Web frontend for letterbox"
authors.workspace = true
edition.workspace = true
license.workspace = true
publish.workspace = true
repository.workspace = true
version.workspace = true
[lib]
crate-type = ["cdylib"]
[build-dependencies]
build-info-build = "0.0.41"
[dev-dependencies]
wasm-bindgen-test = "0.3.33"
wasm-bindgen-test = "0.3.50"
[dependencies]
console_error_panic_hook = "0.1.7"
log = "0.4.17"
seed = "0.9.2"
console_log = {git = "http://git-private.h.xinu.tv/wathiede/console_log.git"}
serde = { version = "1.0.147", features = ["derive"] }
notmuch = {path = "../notmuch"}
shared = {path = "../shared"}
itertools = "0.10.5"
serde_json = { version = "1.0.93", features = ["unbounded_depth"] }
wasm-timer = "0.2.5"
css-inline = "0.8.5"
log = "0.4.27"
seed = { version = "0.10.0", features = ["routing"] }
#seed = "0.9.2"
console_log = { version = "0.1.4", registry = "xinu" }
serde = { version = "1.0.219", features = ["derive"] }
itertools = "0.14.0"
serde_json = { version = "1.0.140", features = ["unbounded_depth"] }
chrono = "0.4.40"
graphql_client = "0.14.0"
thiserror = "2.0.12"
gloo-net = { version = "0.6.0", features = ["json", "serde_json"] }
human_format = "1.1.0"
build-info = "0.0.41"
wasm-bindgen = "=0.2.100"
uuid = { version = "1.16.0", features = [
"js",
] } # direct dep to set js feature, prevents Rng issues
letterbox-shared = { path = "../shared/", version = "0.17.40", registry = "xinu" }
seed_hooks = { version = "0.4.1", registry = "xinu" }
strum_macros = "0.27.1"
gloo-console = "0.3.0"
[target.'cfg(target_arch = "wasm32")'.dependencies]
wasm-sockets = "1.0.0"
[package.metadata.wasm-pack.profile.release]
wasm-opt = ['-Os']
[dependencies.web-sys]
version = "0.3.58"
version = "0.3.77"
features = [
"Clipboard",
"DomRect",
"Element",
"History",
"MediaQueryList",
"Window"
"Navigator",
"Performance",
"ScrollRestoration",
"Window",
]


@@ -1,6 +0,0 @@
.PHONY: all
# Build in release mode and push to minio for serving.
all:
trunk build --release
mc mirror --overwrite --remove dist/ m/letterbox/


@@ -1,5 +1,5 @@
[build]
release = true
release = false
[serve]
# The address to serve on.
@@ -7,5 +7,21 @@ address = "0.0.0.0"
port = 6758
[[proxy]]
backend = "http://localhost:9345/"
rewrite= "/api/"
ws = true
backend = "ws://localhost:9345/api/ws"
[[proxy]]
backend = "http://localhost:9345/api/"
[[proxy]]
backend = "http://localhost:9345/notification/"
[[hooks]]
stage = "pre_build"
command = "printf"
command_arguments = ["\\033c"]
#[[hooks]]
#stage = "pre_build"
#command = "cargo"
#command_arguments = [ "test" ]

web/build.rs Normal file

@@ -0,0 +1,5 @@
fn main() {
// Calling `build_info_build::build_script` collects all data and makes it available to `build_info::build_info!`
// and `build_info::format!` in the main program.
build_info_build::build_script();
}


@@ -0,0 +1,3 @@
mutation AddTagMutation($query: String!, $tag: String!) {
tagAdd(query:$query, tag:$tag)
}


@@ -0,0 +1,3 @@
query CatchupQuery($query: String!) {
catchup(query: $query)
}


@@ -0,0 +1,27 @@
query FrontPageQuery($query: String!, $after: String, $before: String, $first: Int, $last: Int) {
count(query: $query)
search(query: $query, after: $after, before: $before, first: $first, last: $last) {
pageInfo {
hasPreviousPage
hasNextPage
startCursor
endCursor
}
nodes {
thread
total
timestamp
subject
authors
tags
corpus
}
}
tags {
name
bgColor
fgColor
unread
}
version
}

Some files were not shown because too many files have changed in this diff.