feat: automated nightly scraper + housekeeping
All checks were successful
Build and Deploy / Build & Push (push) Successful in 3m11s
Scraper automation (docker-compose):
- Add scraper service to docker-compose.yml using the same image and shared park_data volume; overrides CMD to run scrape-schedule.sh
- scripts/scrape-schedule.sh: runs an initial scrape on container start, then sleeps until 3:00 AM (respects TZ env var) and repeats nightly; logs timestamps and next-run countdown; non-fatal on scrape errors

Staleness window: 7 days → 72 hours in lib/db.ts so data refreshes more frequently with the automated schedule in place

Remove favicon: delete app/icon.tsx and public/logo.svg

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
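The diff below only adds the compose service; scripts/scrape-schedule.sh itself is not shown. A minimal sketch of the behavior the message describes (initial scrape, sleep until 3:00 AM per TZ, nightly repeat, non-fatal errors) might look like this. The scrape entrypoint `node /app/scrape.js`, the `seconds_until` helper, and the `start` guard are assumptions, and GNU `date -d` is assumed (busybox-based images need a different form):

```shell
#!/bin/sh
# Hypothetical sketch of scripts/scrape-schedule.sh; the real script is not
# in the diff, so the scrape command and helper names are assumptions.

# Seconds from now until the next HOUR:00 local time (respects $TZ).
seconds_until() {
  hour=$1
  now=$(date +%s)
  # GNU date's -d is assumed here; busybox date parses differently.
  target=$(date -d "$(date +%F) ${hour}:00:00" +%s)
  [ "$target" -le "$now" ] && target=$((target + 86400))
  echo $((target - now))
}

scrape_once() {
  echo "[$(date '+%F %T')] starting scrape"
  # "||" keeps a failed scrape from killing the container (non-fatal errors).
  node /app/scrape.js || echo "[$(date '+%F %T')] scrape failed; retrying at next window"
}

# Guarded so the helpers can be sourced without entering the loop.
if [ "${1:-}" = "start" ]; then
  scrape_once                          # initial scrape on container start
  while :; do
    wait_s=$(seconds_until 3)
    echo "[$(date '+%F %T')] next run in ${wait_s}s"
    sleep "$wait_s"
    scrape_once
  done
fi
```

Because `sleep` counts wall-clock seconds computed in the container's local time, setting TZ in the compose file is what makes "3am" mean 3am local rather than 3am UTC.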
@@ -9,5 +9,15 @@ services:
       - NODE_ENV=production
     restart: unless-stopped
+
+  scraper:
+    image: gitea.thewrightserver.net/josh/sixflagssupercalendar:latest
+    volumes:
+      - park_data:/app/data
+    environment:
+      - NODE_ENV=production
+      - TZ=America/New_York # set your local timezone so "3am" is 3am your time
+    command: sh /app/scripts/scrape-schedule.sh
+    restart: unless-stopped
 
 volumes:
   park_data: