# Six Flags Super Calendar
A week-by-week calendar showing operating hours for all Six Flags Entertainment Group theme parks — including the former Cedar Fair parks. Data is scraped from the Six Flags internal API and stored locally in SQLite.

## Parks

24 theme parks across the US, Canada, and Mexico:

**Six Flags branded** — Great Adventure (NJ), Magic Mountain (CA), Great America (IL), Over Georgia, Over Texas, St. Louis, Fiesta Texas (TX), New England (MA), Discovery Kingdom (CA), Mexico, Great Escape (NY), Darien Lake (NY), Frontier City (OK)

**Former Cedar Fair** — Cedar Point (OH), Knott's Berry Farm (CA), Canada's Wonderland (ON), Carowinds (NC), Kings Dominion (VA), Kings Island (OH), Valleyfair (MN), Worlds of Fun (MO), Michigan's Adventure (MI), Dorney Park (PA), California's Great America (CA)

## Tech Stack

- **Next.js 15** (App Router, Server Components, standalone output)
- **Tailwind CSS v4** (`@theme {}` CSS variables, no config file)
- **SQLite** via `better-sqlite3` — persisted in `/app/data/parks.db`
- **Playwright** — one-time headless browser run to discover each park's internal API ID
- **Six Flags CloudFront API** — `https://d18car1k0ff81h.cloudfront.net/operating-hours/park/{id}?date=YYYYMM`
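
Once a park's API ID is known, the hours endpoint can be queried directly. A sketch (the ID `45` is a made-up example; real IDs come from `npm run discover`):

```bash
# Build the hours URL for one park and month.
PARK_ID=45     # hypothetical; real ids are stored by `npm run discover`
MONTH=202509   # YYYYMM
URL="https://d18car1k0ff81h.cloudfront.net/operating-hours/park/${PARK_ID}?date=${MONTH}"
echo "$URL"
# Then fetch it, e.g.:
# curl -s "$URL"
```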
## Local Development
### Prerequisites

- Node.js 22+
- npm

### Setup

```bash
npm install
npx playwright install chromium
```
### Seed the database

Run once to discover each park's internal API ID (opens a headless browser per park):

```bash
npm run discover
```

Then scrape operating hours for the full year:

```bash
npm run scrape
```

To force a full re-scrape (ignores the 7-day staleness window):

```bash
npm run scrape:force
```
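
The staleness window amounts to roughly this comparison (a sketch of the idea, not the actual implementation; uses GNU `date`):

```bash
# A park+month is only re-scraped when its last scrape is older than
# the 7-day cutoff.
LAST_SCRAPED=$(date -d '10 days ago' +%s)   # example timestamp
CUTOFF=$(date -d '7 days ago' +%s)
if [ "$LAST_SCRAPED" -lt "$CUTOFF" ]; then
  echo "stale: re-scrape"     # prints this for the 10-day-old example
else
  echo "fresh: skip"
fi
```

`scrape:force` simply bypasses this check.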
### Run the dev server

```bash
npm run dev
```

Open [http://localhost:3000](http://localhost:3000). Navigate weeks with the `←` / `→` buttons or pass `?week=YYYY-MM-DD` directly.

---

## Deployment

### Docker (standalone)

The app uses Next.js standalone output. The SQLite database is stored in a Docker volume at `/app/data`.
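
A minimal `docker-compose.yml` consistent with the commands below might look like this (the `web` service and `park_data` volume names are taken from the commands in this README; the port mapping and restart policy are assumptions):

```yaml
# Hypothetical sketch; adjust to the real compose file in the repo.
services:
  web:
    image: gitea.thewrightserver.net/josh/sixflagssupercalendar:latest
    ports:
      - "3000:3000"
    volumes:
      - park_data:/app/data   # SQLite database lives here
    restart: unless-stopped

volumes:
  park_data:   # surfaces as sixflagssupercalendar_park_data on the host
```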
#### Run
```bash
docker compose up -d
```

#### Seed the database inside the container

The production image includes Playwright and Chromium, so discovery and scraping can be run directly against the running container's volume.

```bash
# Discover API IDs for all parks (one-time, opens headless browser per park)
docker compose exec web npm run discover

# Scrape operating hours for the full year
docker compose exec web npm run scrape
```
Or as one-off containers against the named volume:
```bash
docker run --rm -v sixflagssupercalendar_park_data:/app/data \
  gitea.thewrightserver.net/josh/sixflagssupercalendar:latest \
  npm run discover

docker run --rm -v sixflagssupercalendar_park_data:/app/data \
  gitea.thewrightserver.net/josh/sixflagssupercalendar:latest \
  npm run scrape
```
---
### CI/CD (Gitea Actions)

The pipeline is defined at [`.gitea/workflows/deploy.yml`](.gitea/workflows/deploy.yml).

**Trigger:** Push to `main`

**Steps:**

1. Checkout code
2. Log in to the Gitea container registry
3. Build and tag the image as `:latest` and `:<short-sha>`
4. Push both tags
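
The steps above map onto the official Docker actions roughly as follows. This is a hedged sketch, not the actual file: the action versions, job name, context expressions, and tag truncation are all assumptions.

```yaml
# Hypothetical outline of .gitea/workflows/deploy.yml
name: Build and Deploy
on:
  push:
    branches: [main]

jobs:
  build-and-push:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: docker/login-action@v3
        with:
          registry: ${{ vars.REGISTRY }}
          username: ${{ gitea.actor }}
          password: ${{ secrets.REGISTRY_TOKEN }}
      - uses: docker/build-push-action@v6
        with:
          push: true
          # The real workflow tags with a short sha; truncation is
          # omitted in this sketch.
          tags: |
            ${{ vars.REGISTRY }}/josh/sixflagssupercalendar:latest
            ${{ vars.REGISTRY }}/josh/sixflagssupercalendar:${{ gitea.sha }}
```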
#### Required configuration in Gitea
| Type | Name | Value |
|------|------|-------|
| Variable | `REGISTRY` | Registry hostname — `gitea.thewrightserver.net` |
| Secret | `REGISTRY_TOKEN` | A Gitea access token with `package:write` scope |

Set these under **Repository → Settings → Actions → Variables / Secrets**.
#### Upstream remote

```bash
git remote add origin https://gitea.thewrightserver.net/josh/SixFlagsSuperCalendar.git
git push -u origin main
```
---
## Data Refresh

The scrape job skips any park+month combination scraped within the last 7 days. To keep data current, run `npm run scrape` (or `scrape:force`) on a schedule — weekly is sufficient for a season calendar.

Parks and months not yet in the database show a `—` placeholder in the UI. Parks with no hours data on a given day show "Closed".
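
One way to run the weekly refresh is a host crontab entry (the checkout path below is a placeholder, not a path from this repo):

```bash
# Hypothetical crontab line: re-scrape every Monday at 04:00.
0 4 * * 1  cd /path/to/SixFlagsSuperCalendar && docker compose exec -T web npm run scrape
```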