- Dockerfile: replace single runner stage with web + scraper named targets
- web: Next.js standalone only — no playwright, tsx, or scripts
- scraper: scripts, lib, and the Playwright node_modules only — no Next.js output
- docker-compose.yml: each service pulls its dedicated image tag
- .gitea/workflows/deploy.yml: build both targets on push to main
- lib/db.ts: STALE_AFTER_MS reads PARK_HOURS_STALENESS_HOURS env var (default 72h)
- lib/park-meta.ts: COASTER_STALE_MS reads COASTER_STALENESS_HOURS env var (default 720h)
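A sketch of what the split Dockerfile might look like. Base images, copy paths, and stage layout are assumptions; only the named web/scraper targets come from this commit.

```dockerfile
# Hypothetical sketch of the split build — base images and paths are
# assumptions; only the named web/scraper targets are from the commit.
FROM node:20-slim AS base
WORKDIR /app
COPY package*.json ./
RUN npm ci

FROM base AS build
COPY . .
RUN npm run build

# web: Next.js standalone output only — no Playwright, tsx, or scripts
FROM node:20-slim AS web
WORKDIR /app
COPY --from=build /app/.next/standalone ./
CMD ["node", "server.js"]

# scraper: scripts + lib + Playwright — no Next.js output
FROM mcr.microsoft.com/playwright:v1.47.0-jammy AS scraper
WORKDIR /app
COPY --from=base /app/node_modules ./node_modules
COPY scripts ./scripts
COPY lib ./lib
CMD ["scripts/scrape-schedule.sh"]
```

Each image is then built with `docker build --target web …` or `--target scraper …`, so neither ships the other's dependencies.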
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Scraper automation (docker-compose):
- Add scraper service to docker-compose.yml using the same image and
shared park_data volume; overrides CMD to run scrape-schedule.sh
- scripts/scrape-schedule.sh: runs an initial scrape on container start,
then sleeps until 3:00 AM (respects TZ env var) and repeats nightly;
logs timestamps and next-run countdown; non-fatal on scrape errors
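The sleep-until-3 AM loop might look roughly like this. Function and command names are assumptions, not the actual script; GNU date's `-d` flag is assumed, as on the Debian base image.

```shell
#!/bin/sh
# Hypothetical sketch of scripts/scrape-schedule.sh; names are assumptions.
# Assumes GNU date (-d), as on a Debian image. date(1) honors the TZ env
# var, so "3:00 AM" is local to whatever TZ the container sets.

seconds_until_3am() {
  now=$(date +%s)
  target=$(date -d "03:00" +%s)        # 3:00 AM today, local TZ
  if [ "$target" -le "$now" ]; then    # already past 3 AM: aim for tomorrow
    target=$(date -d "tomorrow 03:00" +%s)
  fi
  echo $((target - now))
}

run_forever() {
  while :; do
    date '+[%F %T] starting scrape'
    # non-fatal: a failed scrape logs and waits for the next run
    npx tsx scripts/scrape.ts || echo 'scrape failed; retrying at next run'
    wait_s=$(seconds_until_3am)
    echo "next run in ${wait_s}s"
    sleep "$wait_s"
  done
}
```

Sleeping on a freshly computed offset each cycle (rather than a fixed `sleep 86400`) keeps the schedule anchored to 3:00 AM even across DST shifts.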
Staleness window: 7 days → 72 hours in lib/db.ts so data refreshes
more frequently with the automated schedule in place
Remove favicon: delete app/icon.tsx and public/logo.svg
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
docker-compose no longer needs REGISTRY_URL env var.
README now uses the actual registry host throughout.
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
The REGISTRY_URL variable was empty, so docker login fell through to Docker Hub.
The workflow now strips the protocol from gitea.server_url to derive the
registry hostname; no manual variable is needed. docker-compose defaults to
the known host.
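The protocol-stripping step can be sketched as a small helper; the function name and the login example are hypothetical, only the transform itself is from the commit.

```shell
# Hypothetical helper mirroring the workflow fix: derive the registry host
# by stripping the scheme from gitea.server_url (POSIX parameter expansion).
strip_proto() {
  host=$1
  host=${host#https://}
  host=${host#http://}
  echo "$host"
}

# In deploy.yml this would feed docker login, e.g.:
#   docker login "$(strip_proto "$SERVER_URL")" -u "$USER" -p "$TOKEN"
```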
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Next.js 15 + Tailwind CSS v4 week calendar showing Six Flags park hours.
Scrapes the parks' internal CloudFront API and stores results in SQLite.
Includes Dockerfile (Debian/Playwright-compatible), docker-compose, and
Gitea Actions pipeline that builds and pushes to the container registry.
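A docker-compose layout consistent with the commits above might look like this. Image tags, the registry host, and the TZ value are assumptions; only the web/scraper split and the shared park_data volume come from the commits.

```yaml
# Hypothetical docker-compose.yml sketch — registry host and tags are
# placeholders; the shared park_data volume is from the commits.
services:
  web:
    image: gitea.example.com/owner/app-web:latest
    ports:
      - "3000:3000"
    volumes:
      - park_data:/app/data
    environment:
      - PARK_HOURS_STALENESS_HOURS=72
  scraper:
    image: gitea.example.com/owner/app-scraper:latest
    volumes:
      - park_data:/app/data
    environment:
      - TZ=America/Chicago   # scrape-schedule.sh sleeps until 3 AM in this zone
volumes:
  park_data:
```

The web container only reads the SQLite file the scraper writes, so the shared volume is the sole coupling between the two services.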
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>