feat: split web and scraper into separate Docker images
All checks were successful
Build and Deploy / Build & Push (push) Successful in 3m4s
- Dockerfile: replace single runner stage with web + scraper named targets
- web: Next.js standalone only — no playwright, tsx, or scripts
- scraper: scripts/lib/node_modules/playwright only — no Next.js output
- docker-compose.yml: each service pulls its dedicated image tag
- .gitea/workflows/deploy.yml: build both targets on push to main
- lib/db.ts: STALE_AFTER_MS reads PARK_HOURS_STALENESS_HOURS env var (default 72h)
- lib/park-meta.ts: COASTER_STALE_MS reads COASTER_STALENESS_HOURS env var (default 720h)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
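The staleness changes in lib/db.ts and lib/park-meta.ts can be sketched as an env-driven threshold. Only the variable names and defaults come from the commit message; `parseStalenessHours` is a hypothetical helper name, not necessarily the code's actual structure:

```typescript
// Hedged sketch: derive a staleness threshold in milliseconds from an
// hours-valued env var, falling back to a default when the variable is
// unset or not a positive number. Helper name is hypothetical.
function parseStalenessHours(raw: string | undefined, defaultHours: number): number {
  const hours = Number(raw);
  const effective = Number.isFinite(hours) && hours > 0 ? hours : defaultHours;
  return effective * 3_600_000; // hours -> ms
}

// lib/db.ts equivalent: default 72h per the commit message
const STALE_AFTER_MS = parseStalenessHours(process.env.PARK_HOURS_STALENESS_HOURS, 72);

// lib/park-meta.ts equivalent: default 720h (30 days)
const COASTER_STALE_MS = parseStalenessHours(process.env.COASTER_STALENESS_HOURS, 720);
```

Guarding against non-numeric or non-positive values means a typo in docker-compose.yml degrades to the compiled-in default rather than marking everything stale (or nothing ever stale).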
docker-compose.yml
@@ -1,6 +1,6 @@
 services:
   web:
-    image: gitea.thewrightserver.net/josh/sixflagssupercalendar:latest
+    image: gitea.thewrightserver.net/josh/sixflagssupercalendar:web
     ports:
       - "3000:3000"
     volumes:
@@ -10,13 +10,14 @@ services:
     restart: unless-stopped

   scraper:
-    image: gitea.thewrightserver.net/josh/sixflagssupercalendar:latest
+    image: gitea.thewrightserver.net/josh/sixflagssupercalendar:scraper
     volumes:
       - park_data:/app/data
     environment:
       - NODE_ENV=production
-      - TZ=America/New_York # set your local timezone so "3am" is 3am your time
-    command: sh /app/scripts/scrape-schedule.sh
+      - TZ=America/New_York
+      - PARK_HOURS_STALENESS_HOURS=72
+      - COASTER_STALENESS_HOURS=720
     restart: unless-stopped

 volumes:
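The web/scraper split named in the commit title could look like the following multi-stage Dockerfile sketch. Only the target names (`web`, `scraper`) and the rough contents of each image come from the commit message; base images, copy paths, and the scraper CMD are assumptions:

```dockerfile
# Hedged sketch, not the actual Dockerfile: base image tags and paths
# are assumptions; only the target names come from the commit.
FROM node:20-alpine AS base
WORKDIR /app

# web target: Next.js standalone output only, no playwright/tsx/scripts
FROM base AS web
COPY .next/standalone ./
COPY .next/static ./.next/static
EXPOSE 3000
CMD ["node", "server.js"]

# scraper target: scripts/lib/node_modules/playwright only, no Next.js
FROM mcr.microsoft.com/playwright:v1.47.0-jammy AS scraper
WORKDIR /app
COPY scripts ./scripts
COPY lib ./lib
COPY node_modules ./node_modules
CMD ["sh", "/app/scripts/scrape-schedule.sh"]
```

With named targets, the workflow can build each image from the same Dockerfile, e.g. `docker build --target web -t …:web .` and `docker build --target scraper -t …:scraper .`, which matches the `:web` and `:scraper` tags the compose file now pulls.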