2 Commits

commit db10d9cfbf
Author: Ricardo
Date:   2026-03-09 17:37:17 +01:00

fix(og): batch spawning to prevent OOM during watcher rebuilds

Full OG regeneration (2,350 images) gets OOM-killed when the Eleventy watcher
is running (~1.8 GB RSS), leaving only ~1.2 GB of headroom in the 3 GB
container. WASM native memory from Satori/Resvg grows beyond what the GC
can reclaim within a single process.

Solution: spawn og-cli in batches of 100 images. Each invocation exits
after its batch, fully releasing all WASM native memory. Exit code 2
signals "more work remains" and the spawner re-loops. Peak memory per
batch stays under ~500 MB regardless of total image count.

Also seed newManifest from the existing manifest so entries not scanned in
the current batch survive batch writes.

Confab-Link: http://localhost:8080/sessions/edb1b7b0-da66-4486-bd9c-d1cfa7553b88
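
A minimal sketch of the batching protocol this commit describes, assuming a
hypothetical og-cli entry point at scripts/og-cli.mjs and a --batch-size
flag; neither name comes from the repository:

    // spawner.ts: re-invoke og-cli until it reports no remaining work.
    import { spawnSync } from "node:child_process";

    const BATCH_SIZE = 100; // images per invocation, per the commit message
    const MORE_WORK = 2;    // exit code meaning "more work remains"

    export function regenerateAllOgImages(): void {
      for (;;) {
        // Each invocation renders at most one batch and then exits, so the
        // WASM native memory held by Satori/Resvg dies with the process.
        const result = spawnSync(
          process.execPath,
          ["scripts/og-cli.mjs", "--batch-size", String(BATCH_SIZE)],
          { stdio: "inherit" },
        );
        if (result.status === MORE_WORK) continue; // re-loop for next batch
        if (result.status === 0) return;           // all images rendered
        throw new Error(`og-cli failed with exit code ${result.status}`);
      }
    }

On the og-cli side, seeding the new manifest from the old one (the second
part of the commit) might look like the following; the manifest path and the
scan/render helpers are assumptions, not the repo's actual interface:

    // og-cli.ts: render one batch, carry over untouched manifest entries,
    // and signal via exit code whether more batches are needed.
    import { readFileSync, writeFileSync } from "node:fs";

    declare function listPagesNeedingOg(m: Record<string, string>): string[];
    declare function renderOgImage(url: string): string; // returns image path

    const MANIFEST = "og/manifest.json"; // assumed location
    const BATCH_SIZE = 100;

    const manifest: Record<string, string> =
      JSON.parse(readFileSync(MANIFEST, "utf8"));

    // Seed newManifest from the existing manifest so entries this batch
    // never scans are carried over instead of dropped on the next write.
    const newManifest: Record<string, string> = { ...manifest };

    const pending = listPagesNeedingOg(manifest);
    for (const url of pending.slice(0, BATCH_SIZE)) {
      newManifest[url] = renderOgImage(url);
    }
    writeFileSync(MANIFEST, JSON.stringify(newManifest, null, 2));

    // Exit code 2 tells the spawner to re-loop; 0 means nothing is left.
    process.exit(pending.length > BATCH_SIZE ? 2 : 0);
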
commit 86cbc1ee5d
Author: Ricardo
Date:   2026-02-18 08:49:23 +01:00

fix: run OG image generation in subprocess to prevent OOM kill

The OG generation in the eleventy.before hook consumed too much memory
alongside Eleventy's data cascade, causing the Eleventy process to be
OOM-killed on Cloudron. Fix: run OG generation in a separate child
process with its own 768 MB heap limit. Also write the manifest
incrementally (every 10 images) to preserve progress if interrupted.
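
A minimal sketch of this earlier approach, assuming a hypothetical child
script at scripts/generate-og.mjs; the hook and helper names below are not
from the repository:

    // Called from eleventy.before instead of rendering in-process.
    import { spawnSync } from "node:child_process";

    export function generateOgImagesInSubprocess(): void {
      const result = spawnSync(
        process.execPath,
        // --max-old-space-size caps the child's V8 heap at 768 MB, keeping
        // OG generation from growing alongside Eleventy's data cascade.
        ["--max-old-space-size=768", "scripts/generate-og.mjs"],
        { stdio: "inherit" },
      );
      if (result.status !== 0) {
        throw new Error(`OG generation exited with code ${result.status}`);
      }
    }

Inside the child, the incremental manifest write could be as simple as
flushing every 10 images (renderOgImage is a stand-in for the real
Satori/Resvg pipeline):

    import { writeFileSync } from "node:fs";

    declare function renderOgImage(url: string): string;

    export function renderAll(urls: string[], manifestPath: string): void {
      const manifest: Record<string, string> = {};
      urls.forEach((url, i) => {
        manifest[url] = renderOgImage(url);
        // Flush every 10 images so an interrupt loses little progress.
        if ((i + 1) % 10 === 0) {
          writeFileSync(manifestPath, JSON.stringify(manifest, null, 2));
        }
      });
      writeFileSync(manifestPath, JSON.stringify(manifest, null, 2)); // final
    }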