146 Commits
v1.1.0 ... dev

Author SHA1 Message Date
fb3c6405c9 Merge pull request 'feat/sort-instances' (#69) from feat/sort-instances into dev
All checks were successful
CI / test (push) Successful in 9s
CI / build-dev (push) Successful in 25s
Reviewed-on: #69
2026-03-29 08:46:15 -04:00
Josh Wright
b6ca460ac6 feat: add sort by vmid, name, last created, last updated on dashboard
All checks were successful
CI / test (pull_request) Successful in 9s
CI / build-dev (pull_request) Has been skipped
- GET /api/instances now accepts ?sort= (name|vmid|created_at|updated_at)
  and ?order= (asc|desc); invalid sort fields fall back to name asc
- Timestamp sorts use id as a tiebreaker (datetime() precision is 1 s)
- Toolbar gains a sort-field <select> and a ↑/↓ direction toggle button
- toggleSortDir() flips direction and re-fetches; state held in data-dir attr
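The whitelist-with-fallback described above might look like this sketch (`SORTABLE` and `buildOrderClause` are hypothetical names, not the actual implementation):

```javascript
// Map of allowed ?sort= values to column names; anything else falls back to name.
const SORTABLE = { name: 'name', vmid: 'vmid', created_at: 'created_at', updated_at: 'updated_at' };

function buildOrderClause(sort, order) {
  const field = SORTABLE[sort] || 'name';          // invalid sort -> name
  const dir = order === 'desc' ? 'DESC' : 'ASC';   // invalid order -> asc
  // datetime() is second-precision, so timestamp sorts need a stable tiebreaker
  const tiebreak = field.endsWith('_at') ? `, id ${dir}` : '';
  return `ORDER BY ${field} ${dir}${tiebreak}`;
}
```

For example, `buildOrderClause('created_at', 'desc')` yields `ORDER BY created_at DESC, id DESC`, while `buildOrderClause('bogus', 'x')` degrades to `ORDER BY name ASC`.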

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-29 08:26:46 -04:00
Josh Wright
8312701147 test: add failing tests for sort/order on GET /api/instances
Tests cover:
- sort by vmid asc/desc
- sort by name desc
- sort by created_at asc/desc (id tiebreaker for same-second inserts)
- sort by updated_at asc/desc (id tiebreaker for same-second inserts)
- invalid sort field falls back to name asc

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-29 08:25:53 -04:00
edf6f674b3 Merge pull request 'v1.6.0' (#65) from dev into main
All checks were successful
CI / test (push) Successful in 15s
Release / release (push) Successful in 52s
CI / build-dev (push) Has been skipped
Reviewed-on: #65
2026-03-28 21:01:27 -04:00
a8d367b4be Merge pull request 'chore: bump to version 1.6.0' (#64) from chore/bump-v1.6.0 into dev
All checks were successful
CI / test (push) Successful in 16s
CI / build-dev (push) Successful in 44s
CI / test (pull_request) Successful in 14s
CI / build-dev (pull_request) Has been skipped
Reviewed-on: #64
2026-03-28 20:59:21 -04:00
5ca0b648ca chore: bump to version 1.6.0
All checks were successful
CI / test (pull_request) Successful in 18s
CI / build-dev (pull_request) Has been skipped
2026-03-28 20:58:41 -04:00
518ed42f60 Merge pull request 'feat: make stats bar cells clickable to filter by state' (#62) from feat/jobs-system into dev
All checks were successful
CI / test (push) Successful in 17s
CI / build-dev (push) Successful in 27s
Reviewed-on: #62
2026-03-28 20:53:14 -04:00
a9147b0198 Merge branch 'dev' into feat/jobs-system
All checks were successful
CI / test (pull_request) Successful in 14s
CI / build-dev (pull_request) Has been skipped
2026-03-28 20:52:44 -04:00
2e3484b1d9 feat: make stats bar cells clickable to filter by state
All checks were successful
CI / test (pull_request) Successful in 14s
CI / build-dev (pull_request) Has been skipped
Clicking deployed/testing/degraded sets the state filter to that
value. Clicking total clears all filters. Hover style added.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 20:51:31 -04:00
cb83d11261 Merge pull request 'fix: config is already a parsed object from the jobs API response' (#61) from feat/jobs-system into dev
All checks were successful
CI / test (push) Successful in 14s
CI / build-dev (push) Successful in 26s
Reviewed-on: #61
2026-03-28 20:47:46 -04:00
047fd0653e Merge branch 'dev' into feat/jobs-system
All checks were successful
CI / test (pull_request) Successful in 13s
CI / build-dev (pull_request) Has been skipped
2026-03-28 20:47:18 -04:00
027ed52768 fix: config is already a parsed object from the jobs API response
All checks were successful
CI / test (pull_request) Successful in 14s
CI / build-dev (pull_request) Has been skipped
maskJob parses job.config before returning it, so calling JSON.parse
on it again threw an exception. The catch returned false for every
job, so relevant was always empty and _waitForOnCreateJobs returned
immediately without polling.
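The failure mode is worth seeing in miniature: `JSON.parse` on a value that is already an object coerces it to `"[object Object]"` and throws. A type guard avoids the double parse (`asObject` is a hypothetical helper, not the repo's code):

```javascript
// Return config as an object whether it arrives parsed or as a JSON string.
function asObject(config) {
  if (typeof config === 'string') {
    try { return JSON.parse(config); } catch { return null; }
  }
  return (config && typeof config === 'object') ? config : null;
}
```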

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 20:46:49 -04:00
e2935c58c8 Merge pull request 'fix: capture job baseline before POST to avoid race condition' (#60) from feat/jobs-system into dev
All checks were successful
CI / test (push) Successful in 16s
CI / build-dev (push) Successful in 29s
Reviewed-on: #60
2026-03-28 20:43:26 -04:00
1bbe743dba fix: capture job baseline before POST to avoid race condition
All checks were successful
CI / test (pull_request) Successful in 14s
CI / build-dev (pull_request) Has been skipped
The previous version snapshotted last_run_id after the 201 response,
but jobs fire immediately server-side — by the time the client fetched
/api/jobs the runs were already complete, so the baseline matched the
new state and the poll loop never detected completion.

Baseline is now captured before the creation POST so it always
reflects pre-run state regardless of job speed.
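The ordering fix can be sketched as follows; `fetchJobs` and `createInstance` stand in for the real API calls:

```javascript
// Snapshot each job's last_run_id BEFORE the creation POST, so even runs
// that complete instantly are seen as "new" relative to the baseline.
async function createWithBaseline(fetchJobs, createInstance, payload) {
  const before = await fetchJobs();                        // pre-POST snapshot
  const baseline = new Map(before.map(j => [j.id, j.last_run_id]));
  const instance = await createInstance(payload);          // jobs may fire now
  return { instance, baseline };                           // poll against baseline
}
```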

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 20:42:46 -04:00
d88b79e9f0 Merge pull request 'feat: auto-refresh UI after on-create jobs complete' (#59) from feat/jobs-system into dev
All checks were successful
CI / test (push) Successful in 14s
CI / build-dev (push) Successful in 29s
Reviewed-on: #59
2026-03-28 20:26:26 -04:00
8a9de6d72a Merge branch 'dev' into feat/jobs-system
All checks were successful
CI / test (pull_request) Successful in 15s
CI / build-dev (pull_request) Has been skipped
2026-03-28 20:25:55 -04:00
ddd528a682 feat: auto-refresh UI after on-create jobs complete
All checks were successful
CI / test (pull_request) Successful in 14s
CI / build-dev (pull_request) Has been skipped
After creating an instance, if any jobs have run_on_create enabled,
the client polls /api/jobs every 500ms until each relevant job has a
new completed run (tracked via last_run_id baseline). The dashboard
or detail page then refreshes automatically. 30s timeout as a safety
net if a job hangs.
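A minimal sketch of that poll loop (names hypothetical): resolve once every relevant job has a run id newer than its baseline, and give up after the timeout.

```javascript
// Poll fetchJobs until each baselined job shows a new last_run_id, or time out.
async function waitForJobs(fetchJobs, baseline, { intervalMs = 500, timeoutMs = 30000 } = {}) {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    const jobs = await fetchJobs();
    const done = [...baseline].every(([id, lastRun]) => {
      const job = jobs.find(j => j.id === id);
      return job && job.last_run_id !== lastRun;
    });
    if (done) return true;
    await new Promise(r => setTimeout(r, intervalMs));
  }
  return false; // timed out — the safety net if a job hangs
}
```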

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 20:25:26 -04:00
03cf2aa9c6 Merge pull request 'fix: millisecond precision timestamps and correct history ordering' (#58) from feat/jobs-system into dev
All checks were successful
CI / test (push) Successful in 14s
CI / build-dev (push) Successful in 29s
Reviewed-on: #58
2026-03-28 20:20:42 -04:00
d84674b0c6 Merge branch 'dev' into feat/jobs-system
All checks were successful
CI / test (pull_request) Successful in 13s
CI / build-dev (pull_request) Has been skipped
2026-03-28 20:20:03 -04:00
7999f46ca2 fix: millisecond precision timestamps and correct history ordering
All checks were successful
CI / test (pull_request) Successful in 21s
CI / build-dev (pull_request) Has been skipped
datetime('now') only stores to the second, making same-second events
indistinguishable. Switched all instance_history and job_runs writes
to strftime('%Y-%m-%dT%H:%M:%f', 'now') for millisecond precision.

Reverted getInstanceHistory to ORDER BY changed_at DESC, id DESC so
newest events appear at the top and instance creation (lowest id,
earliest timestamp) is always at the bottom.
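The `changed_at DESC, id DESC` rule is equivalent to this comparator, shown here only to illustrate why the id tiebreaker matters for same-second rows:

```javascript
// Newest first by timestamp string (ISO-like strings sort lexicographically);
// rows with identical timestamps fall back to insertion order via id.
function byNewest(a, b) {
  if (a.changed_at !== b.changed_at) return a.changed_at < b.changed_at ? 1 : -1;
  return b.id - a.id; // same second: later insert (higher id) first
}
```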

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 20:19:42 -04:00
307c5cf9e8 Merge pull request 'fix: initialize jobs nav dot on every page load' (#57) from feat/jobs-system into dev
All checks were successful
CI / test (push) Successful in 14s
CI / build-dev (push) Successful in 28s
Reviewed-on: #57
2026-03-28 20:16:02 -04:00
34af8e5a8f Merge branch 'dev' into feat/jobs-system
All checks were successful
CI / test (pull_request) Successful in 14s
CI / build-dev (pull_request) Has been skipped
2026-03-28 20:15:37 -04:00
76d2bffb4f fix: initialize jobs nav dot on every page load
All checks were successful
CI / test (pull_request) Successful in 14s
CI / build-dev (pull_request) Has been skipped
Previously the dot only updated when visiting the Jobs page.
Now a jobs fetch runs at bootstrap so the dot reflects status
immediately on any page, including after a hard refresh.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 20:14:53 -04:00
64de0e432c Merge pull request 'fix: queue on-create jobs sequentially and fix history ordering' (#56) from feat/jobs-system into dev
All checks were successful
CI / test (push) Successful in 13s
CI / build-dev (push) Successful in 28s
Reviewed-on: #56
2026-03-28 20:12:31 -04:00
a5b409a348 Merge branch 'dev' into feat/jobs-system
All checks were successful
CI / test (pull_request) Successful in 14s
CI / build-dev (pull_request) Has been skipped
2026-03-28 20:09:59 -04:00
8f35724bde fix: queue on-create jobs sequentially and fix history ordering
All checks were successful
CI / test (pull_request) Successful in 13s
CI / build-dev (pull_request) Has been skipped
runJobsOnCreate now awaits each job before starting the next,
ensuring they don't stomp each other's DB writes in parallel.

getInstanceHistory changed to ORDER BY changed_at ASC, id ASC so
the creation event (lowest id) is always first regardless of
same-second timestamps.
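The sequencing fix in miniature: a `for...of` with `await` runs jobs one at a time, unlike `jobs.map(run)`, which would start them all in parallel.

```javascript
// Await each job before starting the next, so DB writes never interleave.
async function runSequentially(jobs, run) {
  const results = [];
  for (const job of jobs) {
    results.push(await run(job)); // next job starts only after this one ends
  }
  return results;
}
```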

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 20:09:32 -04:00
cec82a3347 Merge pull request 'feat: run jobs on instance creation when run_on_create is enabled' (#54) from feat/jobs-system into dev
All checks were successful
CI / test (push) Successful in 17s
CI / build-dev (push) Successful in 34s
Reviewed-on: #54
2026-03-28 20:01:53 -04:00
883e59789b Merge branch 'dev' into feat/jobs-system
All checks were successful
CI / test (pull_request) Successful in 17s
CI / build-dev (pull_request) Has been skipped
2026-03-28 20:01:20 -04:00
817fdaef13 feat: run jobs on instance creation when run_on_create is enabled
All checks were successful
CI / test (pull_request) Successful in 18s
CI / build-dev (pull_request) Has been skipped
Jobs with run_on_create=true in their config fire automatically
after a new instance is created. Runs fire-and-forget so they don't
delay the 201 response. Option exposed as a checkbox in each job's
detail panel.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 20:00:45 -04:00
9295354e72 Merge pull request 'v1.5.0' (#53) from dev into main
All checks were successful
CI / test (push) Successful in 14s
Release / release (push) Successful in 47s
CI / build-dev (push) Has been skipped
Reviewed-on: #53
2026-03-28 19:51:29 -04:00
372cda6a58 Merge pull request 'chore: bump version to 1.5.0' (#52) from feat/jobs-system into dev
All checks were successful
CI / test (push) Successful in 14s
CI / build-dev (push) Successful in 38s
CI / test (pull_request) Successful in 13s
CI / build-dev (pull_request) Has been skipped
Reviewed-on: #52
2026-03-28 19:49:19 -04:00
3301e942ef Merge branch 'dev' into feat/jobs-system
All checks were successful
CI / test (pull_request) Successful in 14s
CI / build-dev (pull_request) Has been skipped
2026-03-28 19:48:48 -04:00
c4ebb76deb chore: bump version to 1.5.0
All checks were successful
CI / test (pull_request) Successful in 14s
CI / build-dev (pull_request) Has been skipped
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 19:48:16 -04:00
bb765453ab Merge pull request 'feat: include job config and run history in export/import backup' (#51) from feat/jobs-system into dev
All checks were successful
CI / test (push) Successful in 14s
CI / build-dev (push) Successful in 40s
Reviewed-on: #51
2026-03-28 19:44:37 -04:00
88474d1048 Merge branch 'dev' into feat/jobs-system
All checks were successful
CI / test (pull_request) Successful in 17s
CI / build-dev (pull_request) Has been skipped
2026-03-28 19:44:05 -04:00
954d85ca81 feat: include job config and run history in export/import backup
All checks were successful
CI / test (pull_request) Successful in 16s
CI / build-dev (pull_request) Has been skipped
Export bumped to version 3, now includes jobs (with raw unmasked
config) and job_runs arrays. Import restores them when present and
restarts the scheduler. Payloads without a jobs key leave jobs
untouched, keeping v1/v2 backups fully compatible.
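The compatibility rule sketches out like this (`restoreInstances` / `restoreJobs` are hypothetical stand-ins for the real restore paths):

```javascript
// Restore jobs only when the payload actually carries a jobs array,
// so v1/v2 backups (no jobs key) import without touching the jobs table.
function importBackup(payload, { restoreInstances, restoreJobs }) {
  restoreInstances(payload.instances || []);
  if (Array.isArray(payload.jobs)) {
    restoreJobs(payload.jobs, payload.job_runs || []); // v3+ payloads only
  }
}
```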

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 19:43:34 -04:00
117dfc5f17 Merge pull request 'feat: add Semaphore Sync job' (#50) from feat/jobs-system into dev
All checks were successful
CI / test (push) Successful in 15s
CI / build-dev (push) Successful in 27s
Reviewed-on: #50
2026-03-28 19:35:47 -04:00
c39c7a8aef Merge branch 'dev' into feat/jobs-system
All checks were successful
CI / test (pull_request) Successful in 19s
CI / build-dev (pull_request) Has been skipped
2026-03-28 19:35:10 -04:00
a934db1a14 feat: add Semaphore Sync job
All checks were successful
CI / test (pull_request) Successful in 15s
CI / build-dev (pull_request) Has been skipped
Fetches Semaphore project inventory via Bearer auth, parses the
Ansible INI format to extract hostnames, and sets semaphore=1/0
on matching instances.
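A minimal take on that inventory parsing, assuming the common Ansible INI shape where hosts are the first token of each line outside `:vars` sections (the real parser may handle more cases):

```javascript
// Extract hostnames from an Ansible INI inventory: skip blanks, comments,
// section headers, and everything inside [group:vars] sections.
function parseInventoryHostnames(ini) {
  const hosts = [];
  let inVars = false;
  for (const raw of ini.split('\n')) {
    const line = raw.trim();
    if (!line || line.startsWith('#') || line.startsWith(';')) continue;
    if (line.startsWith('[')) { inVars = line.includes(':vars]'); continue; }
    if (inVars) continue;
    hosts.push(line.split(/\s+/)[0]); // drop per-host vars like ansible_host=...
  }
  return hosts;
}
```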

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 19:34:45 -04:00
ea4c5f7c95 Merge pull request 'feat: add Patchmon Sync job' (#49) from feat/jobs-system into dev
All checks were successful
CI / test (push) Successful in 14s
CI / build-dev (push) Successful in 25s
Reviewed-on: #49
2026-03-28 19:24:12 -04:00
5c12acb6c7 Merge branch 'dev' into feat/jobs-system
All checks were successful
CI / test (pull_request) Successful in 17s
CI / build-dev (pull_request) Has been skipped
2026-03-28 19:23:37 -04:00
0b350f3b28 feat: add Patchmon Sync job
All checks were successful
CI / test (pull_request) Successful in 14s
CI / build-dev (pull_request) Has been skipped
Syncs patchmon field on instances by querying the Patchmon hosts API
and matching hostnames. API token masked as REDACTED in responses.
seedJobs now uses INSERT OR IGNORE so new jobs are seeded on existing
installs without re-running the full seed.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 19:22:41 -04:00
db4071a2cf Merge pull request 'fix: move page-jobs inside main so it renders at the top' (#48) from feat/jobs-system into dev
All checks were successful
CI / test (push) Successful in 15s
CI / build-dev (push) Successful in 30s
Reviewed-on: #48
2026-03-28 19:15:38 -04:00
37cd77850e Merge branch 'dev' into feat/jobs-system
All checks were successful
CI / test (pull_request) Successful in 14s
CI / build-dev (pull_request) Has been skipped
2026-03-28 19:15:07 -04:00
14a4826bb6 fix: move page-jobs inside main so it renders at the top
All checks were successful
CI / test (pull_request) Successful in 14s
CI / build-dev (pull_request) Has been skipped
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 19:14:32 -04:00
550135ca37 Merge pull request 'feat: jobs system with dedicated nav page and run history' (#47) from feat/jobs-system into dev
All checks were successful
CI / test (push) Successful in 13s
CI / build-dev (push) Successful in 26s
Reviewed-on: #47
2026-03-28 19:10:50 -04:00
d7727badb1 feat: jobs system with dedicated nav page and run history
All checks were successful
CI / test (pull_request) Successful in 14s
CI / build-dev (pull_request) Has been skipped
Replaces ad-hoc Tailscale config tracking with a proper jobs system.
Jobs get their own nav page (master/detail layout), a dedicated DB
table, and full run history persisted forever. Tailscale connection
settings move from the Settings modal into the Jobs page. Registry
pattern makes adding future jobs straightforward.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 19:09:42 -04:00
537d78e71b Merge pull request 'feat: Tailscale sync jobs' (#46) from feat/tailscale-sync-jobs into dev
All checks were successful
CI / test (push) Successful in 13s
CI / build-dev (push) Successful in 25s
Reviewed-on: #46
2026-03-28 17:12:35 -04:00
47e9c4faf7 feat: Tailscale sync jobs
All checks were successful
CI / test (pull_request) Successful in 14s
CI / build-dev (pull_request) Has been skipped
Adds a background job system that polls the Tailscale API on a configurable
interval and syncs tailscale status and IPs to instances by hostname match.

- New config table (key/value) in SQLite for persistent server-side settings
- New server/jobs.js: runTailscaleSync + restartJobs scheduler
- GET/PUT /api/config — read and write Tailscale settings; API key masked as **REDACTED** on GET
- POST /api/jobs/tailscale/run — immediate manual sync
- Settings modal: new Tailscale Sync section with enable toggle, tailnet, API key, poll interval, Save + Run Now buttons, last-run status
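A restartable interval scheduler of the kind `restartJobs` describes could be as small as this sketch (the actual implementation is not shown here):

```javascript
// Clear any existing timer, then re-arm with the configured poll interval.
// Disabled config leaves no timer running.
let timer = null;
function restartJobs(cfg, runSync) {
  if (timer) clearInterval(timer);
  timer = null;
  if (!cfg.enabled) return;
  timer = setInterval(() => runSync().catch(() => {}), cfg.intervalMs);
}
```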

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 17:11:40 -04:00
31a5090f4f Merge pull request 'fix: remove internal database ID from frontend' (#45) from fix/hide-internal-id into dev
All checks were successful
CI / test (push) Successful in 13s
CI / build-dev (push) Successful in 20s
Reviewed-on: #45
2026-03-28 16:48:19 -04:00
ecdac6fe23 fix: remove internal database ID from frontend
All checks were successful
CI / test (pull_request) Successful in 13s
CI / build-dev (pull_request) Has been skipped
Removed from the instance subtitle and the overview kv grid. The auto-
increment ID is an implementation detail with no user-facing meaning.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 16:47:20 -04:00
07cef73fae Merge pull request 'v1.4.0' (#44) from dev into main
All checks were successful
CI / test (push) Successful in 14s
Release / release (push) Successful in 39s
CI / build-dev (push) Has been skipped
Reviewed-on: #44
2026-03-28 16:16:46 -04:00
1a84edc064 Merge pull request 'chore: bump version to 1.4.0' (#43) from chore/bump-v1.4.0 into dev
All checks were successful
CI / test (push) Successful in 13s
CI / build-dev (push) Successful in 24s
CI / test (pull_request) Successful in 13s
CI / build-dev (pull_request) Has been skipped
Reviewed-on: #43
2026-03-28 16:15:32 -04:00
bfb2c26821 chore: bump version to 1.4.0
All checks were successful
CI / test (pull_request) Successful in 14s
CI / build-dev (pull_request) Has been skipped
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 16:09:08 -04:00
a985268987 Merge pull request 'feat: include history in export/import backup' (#42) from feat/export-import-history into dev
All checks were successful
CI / test (push) Successful in 12s
CI / build-dev (push) Successful in 21s
Reviewed-on: #42
2026-03-28 16:06:21 -04:00
218cdb08c5 feat: include history in export/import backup
All checks were successful
CI / test (pull_request) Successful in 15s
CI / build-dev (pull_request) Has been skipped
Export now returns version 2 with a history array alongside instances.
Import accepts the history array and restores all audit events. v1 backups
without a history key still import cleanly.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 16:04:53 -04:00
2855cc7f81 Merge pull request 'feat: mobile-responsive layout under 640px' (#41) from feat/mobile-responsive into dev
All checks were successful
CI / test (push) Successful in 13s
CI / build-dev (push) Successful in 23s
Reviewed-on: #41
2026-03-28 15:57:07 -04:00
07d2e215e4 Merge branch 'dev' into feat/mobile-responsive
All checks were successful
CI / test (pull_request) Successful in 13s
CI / build-dev (pull_request) Has been skipped
2026-03-28 15:56:38 -04:00
8ef839d6d0 Merge pull request 'fix: wrap image reference in backticks in release notes' (#40) from fix/release-image-codeblock into dev
All checks were successful
CI / test (push) Successful in 12s
CI / build-dev (push) Successful in 21s
Reviewed-on: #40
2026-03-28 15:55:05 -04:00
7af88328c8 feat: mobile-responsive layout under 640px
All checks were successful
CI / test (pull_request) Successful in 14s
CI / build-dev (pull_request) Has been skipped
Single breakpoint, no desktop changes. Key adjustments:
- Reset zoom: 1 (mobile browsers handle scaling)
- Padding drops from 32px to 16px throughout
- Toolbar wraps: search full-width, filters below
- Instance grid and detail grid collapse to single column
- Detail header stacks title above action buttons
- History timeline stacks timestamp above event
- Toggle grid drops from 3 to 2 columns
- Confirm box gets max-width: calc(100vw - 32px) to prevent overflow
- Toast stretches across bottom of screen

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 15:54:12 -04:00
096e2afb3d fix: wrap image reference in backticks in release notes
All checks were successful
CI / test (pull_request) Successful in 13s
CI / build-dev (pull_request) Has been skipped
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 15:44:15 -04:00
e3d089a71f Merge pull request 'v1.3.1' (#39) from dev into main
All checks were successful
CI / test (push) Successful in 14s
Release / release (push) Successful in 42s
CI / build-dev (push) Has been skipped
Reviewed-on: #39
2026-03-28 15:42:00 -04:00
668e7c34bb Merge pull request 'chore: release v1.3.1' (#38) from chore/bump-v1.3.1 into dev
All checks were successful
CI / test (push) Successful in 15s
CI / build-dev (push) Successful in 25s
CI / test (pull_request) Successful in 14s
CI / build-dev (pull_request) Has been skipped
Reviewed-on: #38
2026-03-28 15:40:46 -04:00
e796b4f400 chore: release v1.3.1
All checks were successful
CI / test (pull_request) Successful in 14s
CI / build-dev (pull_request) Has been skipped
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 15:40:16 -04:00
a4b5c20993 Merge pull request 'fix: clear instance history on delete and import' (#37) from fix/delete-clears-history into dev
All checks were successful
CI / test (push) Successful in 13s
CI / build-dev (push) Successful in 20s
Reviewed-on: #37
2026-03-28 15:38:15 -04:00
d17f364fc5 fix: clear instance history on delete and import
All checks were successful
CI / test (pull_request) Successful in 14s
CI / build-dev (pull_request) Has been skipped
deleteInstance now removes history rows for that vmid before removing
the instance. importInstances clears all history before replacing
instances. Prevents stale history appearing when a vmid is reused.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 15:37:45 -04:00
5f79eec3dd Merge pull request 'fix: categorize release notes into New Features / Bug Fixes, drop chores' (#36) from fix/release-notes-format into dev
All checks were successful
CI / test (push) Successful in 12s
CI / build-dev (push) Successful in 20s
Reviewed-on: #36
2026-03-28 15:36:27 -04:00
ed98bb57c0 fix: categorize release notes into New Features / Bug Fixes, drop chores
All checks were successful
CI / test (pull_request) Successful in 13s
CI / build-dev (pull_request) Has been skipped
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 15:35:53 -04:00
120b61a423 Merge pull request 'v1.3.0' (#35) from dev into main
All checks were successful
CI / test (push) Successful in 14s
Release / release (push) Successful in 40s
CI / build-dev (push) Has been skipped
Reviewed-on: #35
2026-03-28 15:31:57 -04:00
074f0600af Merge pull request 'chore: release v1.3.0' (#34) from chore/bump-v1.3.0 into dev
All checks were successful
CI / test (push) Successful in 14s
CI / build-dev (push) Successful in 21s
CI / test (pull_request) Successful in 14s
CI / build-dev (pull_request) Has been skipped
Reviewed-on: #34
2026-03-28 15:30:02 -04:00
e4f9407827 Merge branch 'dev' into chore/bump-v1.3.0
All checks were successful
CI / test (pull_request) Successful in 13s
CI / build-dev (pull_request) Has been skipped
2026-03-28 15:29:09 -04:00
fde5ce7dc1 Merge pull request 'chore: release v1.3.0' (#33) from chore/bump-v1.3.0 into dev
All checks were successful
CI / test (push) Successful in 13s
CI / build-dev (push) Successful in 28s
Reviewed-on: #33
2026-03-28 15:28:44 -04:00
20df10b333 chore: release v1.3.0
All checks were successful
CI / test (pull_request) Successful in 13s
CI / build-dev (pull_request) Has been skipped
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 15:28:39 -04:00
c906511bfc chore: release v1.3.0
All checks were successful
CI / test (pull_request) Successful in 13s
CI / build-dev (pull_request) Has been skipped
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 15:27:30 -04:00
745e5920ad Merge pull request 'fix: set html zoom 1.1 so default scale matches browser 110%' (#32) from fix/base-zoom into dev
All checks were successful
CI / test (push) Successful in 12s
CI / build-dev (push) Successful in 20s
Reviewed-on: #32
2026-03-28 15:25:32 -04:00
90e0a98914 fix: set html zoom 1.1 so default scale matches browser 110%
All checks were successful
CI / test (pull_request) Successful in 13s
CI / build-dev (pull_request) Has been skipped
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 15:24:58 -04:00
cba4b73798 Merge pull request 'fix: use badge for stack on detail overview, consistent across all views' (#31) from fix/detail-stack-badge into dev
All checks were successful
CI / test (push) Successful in 14s
CI / build-dev (push) Successful in 21s
Reviewed-on: #31
2026-03-28 15:22:13 -04:00
0d567472a9 fix: use badge for stack on detail overview, consistent across all views
All checks were successful
CI / test (pull_request) Successful in 12s
CI / build-dev (pull_request) Has been skipped
stack was plain highlighted text on the detail page but a coloured badge
on the home cards and in the history timeline. Now all three use the same
badge component.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 15:21:25 -04:00
9f6b2ece52 Merge pull request 'fix: parse SQLite timestamps as UTC, not local time' (#30) from fix/sqlite-utc-timestamps into dev
All checks were successful
CI / test (push) Successful in 12s
CI / build-dev (push) Successful in 20s
Reviewed-on: #30
2026-03-28 15:20:15 -04:00
e3911157e9 fix: parse SQLite timestamps as UTC, not local time
All checks were successful
CI / test (pull_request) Successful in 14s
CI / build-dev (pull_request) Has been skipped
SQLite datetime('now') returns 'YYYY-MM-DD HH:MM:SS' with no timezone
marker. JS was treating this as local time, so timestamps showed the
correct UTC digits but with the local TZ abbreviation attached (e.g.
'7:15 PM EDT' when the real local time was '3:15 PM EDT').

Add parseUtc() which appends 'Z' before parsing any string that has no
existing timezone marker, ensuring JS always treats them as UTC before
the display-timezone conversion is applied.
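The essence of that fix, as a sketch consistent with the description above:

```javascript
// Treat timezone-less SQLite strings ('YYYY-MM-DD HH:MM:SS') as UTC by
// appending 'Z'; strings that already carry a marker are parsed as-is.
function parseUtc(s) {
  const hasTz = /(?:Z|[+-]\d{2}:?\d{2})$/.test(s);
  return new Date(hasTz ? s : s.replace(' ', 'T') + 'Z');
}
```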

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 15:19:35 -04:00
0589288dfe Merge pull request 'fix: populate nav instance count on direct detail page load/refresh' (#29) from fix/detail-nav-count into dev
All checks were successful
CI / test (push) Successful in 13s
CI / build-dev (push) Successful in 21s
Reviewed-on: #29
2026-03-28 15:16:32 -04:00
8ead7687e5 fix: populate nav instance count on direct detail page load/refresh
All checks were successful
CI / test (pull_request) Successful in 13s
CI / build-dev (pull_request) Has been skipped
nav-count was only set in renderDashboard, so loading /instance/:vmid
directly left it showing "—". Add getInstances() to the parallel fetch
in renderDetailPage and set the count there too.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 15:15:55 -04:00
0e1e9b6699 Merge pull request 'fix: show stack badge in history timeline, matching state treatment' (#28) from fix/history-stack-badge into dev
All checks were successful
CI / test (push) Successful in 13s
CI / build-dev (push) Successful in 22s
Reviewed-on: #28
2026-03-28 15:14:02 -04:00
3c008c5bce fix: show stack badge in history timeline, matching state treatment
All checks were successful
CI / test (pull_request) Successful in 13s
CI / build-dev (pull_request) Has been skipped
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 15:13:27 -04:00
1582c28b28 Merge pull request 'fix: clean up instance detail subtitle — dividers, readable values' (#27) from feat/timezone-settings into dev
All checks were successful
CI / test (push) Successful in 12s
CI / build-dev (push) Successful in 21s
Reviewed-on: #27
2026-03-28 15:10:54 -04:00
bcd934f5b1 Merge branch 'dev' into feat/timezone-settings
All checks were successful
CI / test (pull_request) Successful in 12s
CI / build-dev (pull_request) Has been skipped
2026-03-28 15:10:26 -04:00
4c9acd20c7 fix: clean up instance detail subtitle — dividers, readable values
All checks were successful
CI / test (pull_request) Successful in 13s
CI / build-dev (pull_request) Has been skipped
Separate vmid / id / created with a subtle vertical border so they
don't run together. Bump font to 13px. Labels drop to 11px muted,
values use full --text colour so the actual data stands out clearly.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 15:10:05 -04:00
520fb98d96 Merge pull request 'feat: redesign history timeline — single-line, timestamp right-aligned' (#26) from feat/timezone-settings into dev
All checks were successful
CI / test (push) Successful in 12s
CI / build-dev (push) Successful in 19s
Reviewed-on: #26
2026-03-28 15:07:01 -04:00
800184d2be Merge branch 'dev' into feat/timezone-settings
All checks were successful
CI / test (pull_request) Successful in 12s
CI / build-dev (pull_request) Has been skipped
2026-03-28 15:06:34 -04:00
82c314f85c feat: redesign history timeline — single-line, timestamp right-aligned
All checks were successful
CI / test (pull_request) Successful in 13s
CI / build-dev (pull_request) Has been skipped
Each event is now one row: label · old → new on the left, timestamp
right-aligned. Nothing is far from anything else. State changes use the
existing badge component for immediate visual recognition. The created
event reads 'instance created' in accent. Middle-dot separator keeps
field label and change value clearly associated without forced spacing.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 15:06:09 -04:00
2fba532ec7 Merge pull request 'feat: rework history timeline for clarity' (#25) from feat/timezone-settings into dev
All checks were successful
CI / test (push) Successful in 13s
CI / build-dev (push) Successful in 20s
Reviewed-on: #25
2026-03-28 15:01:39 -04:00
9177578aaf Merge branch 'dev' into feat/timezone-settings
All checks were successful
CI / test (pull_request) Successful in 13s
CI / build-dev (pull_request) Has been skipped
2026-03-28 15:01:12 -04:00
94c4a0af51 feat: rework history timeline for clarity
All checks were successful
CI / test (pull_request) Successful in 12s
CI / build-dev (pull_request) Has been skipped
Timestamp now sits on its own line above each event so it's visually
separate from the change description. Field names use a friendly label
map (hardware_acceleration → hw acceleration, tailscale_ip → tailscale ip,
etc.). The created event reads "instance created" in accent colour instead
of a raw "created / —". Padding between rows increased.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 15:00:22 -04:00
ec60d53767 Merge pull request 'feat: timezone setting — display timestamps in selected local timezone' (#24) from feat/timezone-settings into dev
All checks were successful
CI / test (push) Successful in 12s
CI / build-dev (push) Successful in 19s
Reviewed-on: #24
2026-03-28 14:56:01 -04:00
ad81d7ace7 Merge branch 'dev' into feat/timezone-settings
All checks were successful
CI / test (pull_request) Successful in 12s
CI / build-dev (pull_request) Has been skipped
2026-03-28 14:55:35 -04:00
badd542bd7 feat: timezone setting — display timestamps in selected local timezone
All checks were successful
CI / test (pull_request) Successful in 13s
CI / build-dev (pull_request) Has been skipped
Add a Display section to the settings modal with a timezone dropdown.
Selection is persisted to localStorage and applied to all timestamps via
fmtDate (date-only) and fmtDateFull (date + time + TZ abbreviation, e.g.
"Mar 28, 2026, 2:48 PM EDT"). Changing the timezone live-re-renders the
current page. Defaults to UTC.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 14:53:20 -04:00
7c31ee3327 Merge pull request 'chore: maintenance — test coverage, route cleanup, README rewrite' (#23) from chore/maintenance into dev
All checks were successful
CI / test (push) Successful in 13s
CI / build-dev (push) Successful in 23s
Reviewed-on: #23
2026-03-28 14:47:27 -04:00
0ecfa7dbc9 chore: maintenance — test coverage, route cleanup, README rewrite
All checks were successful
CI / test (pull_request) Successful in 13s
CI / build-dev (pull_request) Has been skipped
- Add fmtHistVal and stateClass helper tests (7 new, 106 total)
- Add import regression test: missing name field returns 400 not 500
- Fix normalise() crash on missing name: body.name.trim() → (body.name ?? '').trim()
- Extract duplicate DB error handler into handleDbError() helper
- Rewrite README from scratch with audit log, export/import, full API docs

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 14:46:48 -04:00
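The `normalise()` fix noted above amounts to a nullish-coalescing guard; this minimal sketch omits the rest of the real function.

```javascript
// Minimal sketch of the crash fix: a missing name field becomes ''
// (and later fails validation with a 400) instead of throwing.
function normalise(body) {
  return {
    ...body,
    name: (body.name ?? '').trim(), // was body.name.trim(), a TypeError on undefined
  };
}
```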
f16fb3e088 Merge pull request 'feat: audit log / history timeline on instance detail page' (#22) from feat/history-timeline into dev
All checks were successful
CI / test (push) Successful in 17s
CI / build-dev (push) Successful in 22s
Reviewed-on: #22
2026-03-28 14:36:22 -04:00
cb01573cdf feat: audit log / history timeline on instance detail page
All checks were successful
CI / test (pull_request) Successful in 13s
CI / build-dev (pull_request) Has been skipped
Adds an instance_history table that records every field change:
- createInstance logs a 'created' event
- updateInstance diffs old vs new and logs one row per changed field
  (name, state, stack, vmid, tailscale_ip, all service flags)
- History is stored under the new vmid when vmid changes

New endpoint: GET /api/instances/:vmid/history

The 'timestamps' section on the detail page is replaced with a
grid timeline showing timestamp | field | old → new for each event.
State changes are colour-coded (deployed=green, testing=amber,
degraded=red). Boolean service flags display as on/off.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 14:35:35 -04:00
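The per-field diff described above might look roughly like this; `diffInstance` and `TRACKED` are hypothetical names, and the real `updateInstance` inserts one `instance_history` row per event rather than returning an array.

```javascript
// Hypothetical sketch: compare old and new rows field by field and emit
// one event per change. The real field list also covers the service flags.
const TRACKED = ['name', 'state', 'stack', 'vmid', 'tailscale_ip'];

function diffInstance(oldRow, newRow) {
  const events = [];
  for (const field of TRACKED) {
    if (oldRow[field] !== newRow[field]) {
      events.push({ field, old_value: oldRow[field], new_value: newRow[field] });
    }
  }
  return events;
}
```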
b48d5fb836 Merge pull request 'fix: remove stacks count from stats bar' (#21) from fix/remove-stacks-stat into dev
All checks were successful
CI / test (push) Successful in 13s
CI / build-dev (push) Successful in 20s
Reviewed-on: #21
2026-03-28 14:28:19 -04:00
6e124576cb fix: remove stacks count from stats bar
All checks were successful
CI / test (pull_request) Successful in 12s
CI / build-dev (pull_request) Has been skipped
Stacks are always just production/development — counting them adds
no useful information to the dashboard summary.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 14:27:43 -04:00
1f328e026d Merge pull request 'fix: uniform 16px spacing above all settings sections' (#20) from fix/settings-section-spacing into dev
All checks were successful
CI / test (push) Successful in 14s
CI / build-dev (push) Successful in 22s
Reviewed-on: #20
2026-03-28 14:24:09 -04:00
71c2c68fbc fix: uniform 16px spacing above all settings sections
All checks were successful
CI / test (pull_request) Successful in 12s
CI / build-dev (pull_request) Has been skipped
Removing the :first-child { padding-top: 0 } override lets every
section use the same padding: 16px 0, so the gap above Export matches
the gap above Import (and any future sections).

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 14:23:14 -04:00
8bcf8229db Merge pull request 'fix: remove top padding from settings modal body' (#19) from fix/settings-modal-body-padding into dev
All checks were successful
CI / test (push) Successful in 17s
CI / build-dev (push) Successful in 21s
Reviewed-on: #19
2026-03-28 14:20:20 -04:00
6e1e9f7153 fix: remove top padding from settings modal body
All checks were successful
CI / test (pull_request) Successful in 14s
CI / build-dev (pull_request) Has been skipped
The modal-body's 22px padding-top created a visible gap between the
header divider and the Export section title.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 14:19:39 -04:00
1fbb74d1ef Merge pull request 'fix: remove top gap above first settings section' (#18) from feat/settings-modal into dev
All checks were successful
CI / test (push) Successful in 13s
CI / build-dev (push) Successful in 23s
Reviewed-on: #18
2026-03-28 14:16:22 -04:00
617a5b5800 Merge branch 'dev' into feat/settings-modal
All checks were successful
CI / test (pull_request) Successful in 13s
CI / build-dev (pull_request) Has been skipped
2026-03-28 14:15:57 -04:00
0985d9d481 fix: remove top gap above first settings section
All checks were successful
CI / test (pull_request) Successful in 13s
CI / build-dev (pull_request) Has been skipped
padding-top on the first .settings-section created a visible gap
above the Export title. Fixed with :first-child { padding-top: 0 }.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 14:15:25 -04:00
2af6c56558 Merge pull request 'feat: settings modal with database export and import' (#17) from feat/settings-modal into dev
All checks were successful
CI / test (push) Successful in 13s
CI / build-dev (push) Successful in 21s
Reviewed-on: #17
2026-03-28 14:12:08 -04:00
af207339a4 feat: settings modal with database export and import
All checks were successful
CI / test (pull_request) Successful in 12s
CI / build-dev (pull_request) Has been skipped
Adds a gear button to the nav that opens a settings modal with:
- Export: GET /api/export returns all instances as a JSON backup file
  with a Content-Disposition attachment header
- Import: POST /api/import validates and bulk-replaces all instances;
  client uses FileReader to POST the parsed JSON, with a confirm dialog
  before destructive replace

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 14:10:59 -04:00
cd16b7ea28 Merge pull request 'v1.2.2' (#16) from dev into main
All checks were successful
CI / test (push) Successful in 13s
Release / release (push) Successful in 34s
CI / build-dev (push) Has been skipped
Reviewed-on: #16
2026-03-28 14:01:33 -04:00
20d8a13375 Merge pull request 'chore: bump version to 1.2.2' (#15) from chore/bump-v1.2.2 into dev
All checks were successful
CI / test (push) Successful in 14s
CI / build-dev (push) Successful in 24s
CI / test (pull_request) Successful in 14s
CI / build-dev (pull_request) Has been skipped
Reviewed-on: #15
2026-03-28 13:59:53 -04:00
f72aaa52f8 chore: bump version to 1.2.2
All checks were successful
CI / test (pull_request) Successful in 12s
CI / build-dev (pull_request) Has been skipped
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 13:59:23 -04:00
dd47d5006e Merge pull request 'fix: collapse python3 one-liner to fix YAML indentation error' (#14) from fix/release-yaml-indent into dev
All checks were successful
CI / test (push) Successful in 12s
CI / build-dev (push) Successful in 21s
Reviewed-on: #14
2026-03-28 13:58:26 -04:00
10e25e1803 fix: collapse python3 one-liner to fix YAML indentation error
All checks were successful
CI / test (pull_request) Successful in 13s
CI / build-dev (pull_request) Has been skipped
Multi-line python3 -c "..." had unindented code outside the run: | block,
causing 'yaml: line 83: could not find expected :'. Collapsed to a single
indented line so the YAML parser sees it correctly.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 13:57:48 -04:00
afbdefa549 Merge pull request 'v1.2.1' (#13) from dev into main
All checks were successful
CI / test (push) Successful in 14s
CI / build-dev (push) Has been skipped
Reviewed-on: #13
2026-03-28 13:55:34 -04:00
1a62e2fdd9 Merge pull request 'chore: bump version to 1.2.1' (#12) from chore/bump-v1.2.1 into dev
All checks were successful
CI / test (push) Successful in 15s
CI / build-dev (push) Successful in 28s
CI / test (pull_request) Successful in 12s
CI / build-dev (pull_request) Has been skipped
Reviewed-on: #12
2026-03-28 13:54:12 -04:00
1271c061fd chore: bump version to 1.2.1
All checks were successful
CI / test (pull_request) Successful in 16s
CI / build-dev (pull_request) Has been skipped
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 13:52:40 -04:00
7b2a996c21 Merge pull request 'fix: remove npm cache and fix release notes shell injection' (#11) from fix/release-workflow into dev
All checks were successful
CI / test (push) Successful in 13s
CI / build-dev (push) Successful in 20s
Reviewed-on: #11
2026-03-28 13:51:49 -04:00
3233d65db0 fix: remove npm cache and fix release notes shell injection
All checks were successful
CI / test (pull_request) Successful in 14s
CI / build-dev (pull_request) Has been skipped
cache: npm caused ~4min ETIMEDOUT on every run (cache service unreachable).

Commit messages containing backticks were shell-expanded inside the
curl -d "..." string, causing 'sha: No such file or directory'. Fixed by
writing release notes to a temp file and using python3 to build the JSON
payload, then passing it to curl with --data @file.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 13:49:38 -04:00
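The temp-file approach described above can be sketched as follows; the paths, sample commit messages, and endpoint variable are illustrative only.

```shell
# Illustrative sketch: notes go to a file, Python builds the JSON payload,
# and curl reads it from disk, so backticks in commit messages are never
# expanded by the shell.
printf -- '- fix: handle `sha` safely\n- feat: add export\n' > /tmp/release_notes.txt

python3 - << 'EOF'
import json
body = open('/tmp/release_notes.txt').read()
open('/tmp/release_body.json', 'w').write(json.dumps({'body': body}))
EOF

# Hypothetical endpoint; in CI this is the Gitea releases API:
# curl -sf -X POST -H "Content-Type: application/json" \
#   --data @/tmp/release_body.json "$RELEASE_URL"
```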
f1e192c5d4 Merge pull request 'v1.2.0' (#10) from dev into main
Some checks failed
CI / test (push) Successful in 13s
Release / release (push) Failing after 5m14s
CI / build-dev (push) Has been skipped
Reviewed-on: #10
2026-03-28 13:24:34 -04:00
3037381084 Merge pull request 'chore: bump version to 1.2.0' (#9) from chore/bump-v1.2.0 into dev
All checks were successful
CI / test (push) Successful in 13s
CI / build-dev (push) Successful in 26s
CI / test (pull_request) Successful in 13s
CI / build-dev (pull_request) Has been skipped
Reviewed-on: #9
2026-03-28 13:22:15 -04:00
e54c1d4848 chore: bump version to 1.2.0
All checks were successful
CI / test (pull_request) Successful in 13s
CI / build-dev (pull_request) Has been skipped
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 13:21:18 -04:00
3ae3f98df5 Merge pull request 'fix: use git rev-parse for short SHA in build-dev' (#8) from fix/ci-short-sha into dev
All checks were successful
CI / test (push) Successful in 12s
CI / build-dev (push) Successful in 20s
Reviewed-on: #8
2026-03-28 13:19:23 -04:00
65d6514603 fix: use git rev-parse for short SHA in build-dev
All checks were successful
CI / test (pull_request) Successful in 12s
CI / build-dev (pull_request) Has been skipped
$GITEA_SHA is unset on Gitea runners — the nav showed "dev-" with an
empty SHA. git rev-parse --short HEAD works regardless of runner env vars.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 13:18:47 -04:00
bc44bcbde9 Merge pull request 'fix: remove npm cache from setup-node' (#7) from fix/ci-remove-npm-cache into dev
All checks were successful
CI / test (push) Successful in 12s
CI / build-dev (push) Successful in 13s
Reviewed-on: #7
2026-03-28 13:16:33 -04:00
cae0f2222a fix: remove npm cache from setup-node
All checks were successful
CI / test (pull_request) Successful in 13s
CI / build-dev (pull_request) Has been skipped
The Gitea runner's cache service is unreachable, causing a ~4 minute
ETIMEDOUT on every run before falling back to a cold install anyway.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 13:11:05 -04:00
28833a7ec6 Merge pull request 'feat: show dev-<sha> version string in nav for dev builds' (#6) from feat/dev-version-string into dev
All checks were successful
CI / test (push) Successful in 9m33s
CI / build-dev (push) Successful in 22s
Reviewed-on: #6
2026-03-28 13:03:23 -04:00
6ba02bf17d feat: show dev-<sha> version string in nav for dev builds
All checks were successful
CI / test (pull_request) Successful in 9m31s
CI / build-dev (pull_request) Has been skipped
Production images continue to display the semver (v1.x.x). Dev images
built by CI now receive BUILD_VERSION=dev-<7-char-sha> via a Docker ARG,
and app.js skips the v prefix for non-semver strings.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 12:52:15 -04:00
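The version-label rule described above can be sketched in a few lines; `versionLabel` is an assumed name, not the actual app.js function.

```javascript
// Sketch: semver strings get the "v" prefix; dev-<sha> builds show as-is.
function versionLabel(version) {
  return /^\d+\.\d+\.\d+$/.test(version) ? `v${version}` : version;
}
```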
bfe71b2511 Merge pull request 'fix: centre badge text on instance cards' (#5) from fix/badge-alignment into dev
All checks were successful
CI / test (push) Successful in 9m32s
CI / build-dev (push) Successful in 20s
Reviewed-on: #5
2026-03-28 12:38:53 -04:00
0f2a37cb39 fix: centre badge text on instance cards
All checks were successful
CI / test (pull_request) Successful in 9m31s
CI / build-dev (pull_request) Has been skipped
.badge lacked text-align: center. Inside the card's flex-end right
column, badge text was left-justified within each pill, making state
labels (deployed / testing / degraded) appear skewed to the left.

TDD: CSS regression test added to tests/helpers.test.js — reads
css/app.css directly and asserts the rule is present, so this
cannot regress silently in future.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 12:28:44 -04:00
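The CSS regression test described above boils down to a regex check over the stylesheet text; this sketch inlines a sample string where the real test reads css/app.css from disk, and the helper name is assumed.

```javascript
// Sketch of the regression check: the .badge rule must centre its text.
function badgeIsCentred(cssText) {
  return /\.badge\s*\{[^}]*text-align:\s*center/.test(cssText);
}

// The real test reads css/app.css; a literal stands in here.
const sampleCss = '.badge { text-align: center; }';
```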
73f4eabbc7 Merge pull request 'fix: db volume ownership and explicit error handling for write failures' (#3) from fix/db-permissions-and-error-handling into dev
All checks were successful
CI / test (push) Successful in 9m33s
CI / build-dev (push) Successful in 21s
Reviewed-on: #3
2026-03-28 12:10:32 -04:00
515ff8ddb3 Merge branch 'dev' into fix/db-permissions-and-error-handling
All checks were successful
CI / test (pull_request) Successful in 9m28s
CI / build-dev (pull_request) Has been skipped
2026-03-28 11:48:36 -04:00
08c12c9394 fix: skip db boot init in test env to prevent parallel worker lock
All checks were successful
CI / test (pull_request) Successful in 9m33s
CI / build-dev (pull_request) Has been skipped
Vitest runs test files in parallel workers. Each worker imports server/db.js,
which triggered module-level init(DEFAULT_PATH) unconditionally. Two workers
racing to open the same SQLite file caused "database is locked", followed
by process.exit(1) killing the worker — surfacing as:

  Error: process.exit unexpectedly called with "1"

Fix: guard the boot init block behind NODE_ENV !== 'test'. Vitest sets
NODE_ENV=test automatically. Each worker's beforeEach(() => _resetForTest())
initialises its own :memory: database, so no file coordination is needed.

process.exit(1) is also guarded by the same condition — it must never
fire inside a test runner process.

TDD: two regression tests added to tests/db.test.js documenting the
expected boot behaviour and proving the module loads cleanly in parallel.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 11:48:07 -04:00
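The guard described above can be sketched as a small function; `bootInit`, `init`, and `DEFAULT_PATH` are stand-ins for the real server/db.js internals.

```javascript
const DEFAULT_PATH = 'data/catalyst.db';

// Placeholder for the real database initialiser.
function init(path) {
  return { path };
}

// Boot-time init is skipped entirely under the test runner; each Vitest
// worker builds its own :memory: database in beforeEach instead.
function bootInit(env) {
  if (env.NODE_ENV === 'test') return null;
  return init(DEFAULT_PATH);
}
```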
4ce7df4649 Merge pull request 'fix: skip db boot init in test env to prevent parallel worker lock' (#4) from fix/db-boot-test-isolation into dev
Some checks failed
CI / test (push) Successful in 9m29s
CI / build-dev (push) Has been cancelled
Reviewed-on: #4
2026-03-28 11:41:55 -04:00
6c04a30c3a fix: skip db boot init in test env to prevent parallel worker lock
All checks were successful
CI / test (pull_request) Successful in 9m29s
CI / build-dev (pull_request) Has been skipped
Vitest runs test files in parallel workers. Each worker imports server/db.js,
which triggered module-level init(DEFAULT_PATH) unconditionally. Two workers
racing to open the same SQLite file caused "database is locked", followed
by process.exit(1) killing the worker — surfacing as:

  Error: process.exit unexpectedly called with "1"

Fix: guard the boot init block behind NODE_ENV !== 'test'. Vitest sets
NODE_ENV=test automatically. Each worker's beforeEach(() => _resetForTest())
initialises its own :memory: database, so no file coordination is needed.

process.exit(1) is also guarded by the same condition — it must never
fire inside a test runner process.

TDD: two regression tests added to tests/db.test.js documenting the
expected boot behaviour and proving the module loads cleanly in parallel.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 11:31:55 -04:00
c6cd8098fd Merge branch 'dev' into fix/db-permissions-and-error-handling
Some checks failed
CI / test (pull_request) Failing after 4m52s
CI / build-dev (pull_request) Has been skipped
2026-03-28 11:16:31 -04:00
15ed329743 fix: db volume ownership and explicit error handling for write failures
All checks were successful
CI / test (pull_request) Successful in 9m32s
Root cause of the 500 on create/update/delete: the non-root app user in
the Docker container lacked write permission to the volume mount point.
Docker volume mounts are owned by root by default; the app user (added
in a previous commit) could read the database but not write to it.

Fixes:

1. Dockerfile — RUN mkdir -p /app/data before chown so the directory
   exists in the image with correct ownership. Docker uses this as a
   seed when initialising a new named volume, ensuring the app user
   owns the mount point from the start.

   NOTE: existing volumes from before the non-root user was introduced
   will still be root-owned. Fix with:
     docker run --rm -v catalyst-data:/data alpine chown -R 1000:1000 /data

2. server/routes.js — replace bare `throw e` in POST/PUT catch blocks
   with console.error (route context + error) + explicit 500 response.
   Add try-catch to DELETE handler which previously had none. Unexpected
   DB errors now log the route they came from and return a clean JSON
   body instead of relying on the generic Express error handler.

3. server/db.js — wrap the boot init() call in try-catch. Fatal startup
   errors (e.g. data directory not writable) now print a clear message
   pointing to the cause before exiting, instead of a raw stack trace.

TDD: tests written first (RED), then fixed (GREEN). Six new tests in
tests/api.test.js verify that unexpected DB errors on POST, PUT, and
DELETE return 500 with { error: 'internal server error' } and call
console.error with the route context string.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 11:11:00 -04:00
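The explicit 500 handling in point 2 can be sketched like this; `handleDbError` matches the helper name extracted in a later maintenance commit, but the signature here is an assumption.

```javascript
// Assumed-signature sketch: log the route context plus the error, then
// return a clean JSON 500 instead of leaking a raw stack trace.
function handleDbError(route, err, res) {
  console.error(`[${route}]`, err);
  return res.status(500).json({ error: 'internal server error' });
}
```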
1412b2e0b7 Merge pull request 'feat: build :dev Docker image on push to dev' (#1) from chore/dev-staging-build into dev
All checks were successful
CI / test (push) Successful in 9m29s
CI / build-dev (push) Successful in 13s
Reviewed-on: #1
2026-03-28 10:38:48 -04:00
30b037ff9c feat: build :dev Docker image on push to dev
All checks were successful
CI / test (pull_request) Successful in 9m30s
CI / build-dev (pull_request) Has been skipped
Adds a build-dev job to ci.yml that fires after tests pass on direct
pushes to dev (not PRs). Pushes two tags to the registry:

  :dev          — mutable, always the latest integrated dev state
  :dev-<sha>    — immutable, for tracing exactly which commit is running

Staging servers can pull :dev to test before a release PR is opened.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 10:27:23 -04:00
7a5b5d7afc chore: establish dev branch and branching workflow
All checks were successful
CI / test (push) Successful in 9m26s
Merges the initial ci.yml + release.yml workflow changes onto dev.
This is the first merge under the new feature-branch → dev → main model.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 10:16:00 -04:00
3383bee968 chore: replace build.yml with ci.yml + release.yml
Splits the single workflow into two with distinct responsibilities:

ci.yml    — runs tests on push/PR to dev and main. Powers the required
            status check for branch protection on both branches.

release.yml — triggers on push to main (merged PR). Reads version from
              package.json, asserts the tag doesn't already exist, creates
              the git tag, generates patch notes from commits since the
              previous tag, builds and pushes the Docker image, and creates
              the Gitea release. No more manual git tag or git push --tags.

build.yml deleted — all three of its jobs are covered by the new files.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 10:15:52 -04:00
0c30e4bd29 chore: release v1.1.2
All checks were successful
Build / test (push) Successful in 9m27s
Build / build (push) Successful in 27s
Build / release (push) Successful in 1s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 09:52:57 -04:00
01f83d25f6 fix: SPA deep-link assets and broken home screen CSS
Three root causes addressed:

1. Added <base href="/"> to index.html so all relative asset paths
   (css/app.css, js/*.js) resolve from the root regardless of the
   current SPA route. Without this, /instance/117 requested
   /instance/css/app.css, which hit the SPA fallback and returned
   HTML; helmet's nosniff then refused it as a stylesheet.

2. Removed upgrade-insecure-requests from the CSP (useDefaults: false).
   This directive told browsers to upgrade HTTP→HTTPS for every asset
   request, breaking all resource loading on HTTP-only deployments.

3. Changed script-src-attr from 'none' to 'unsafe-inline' to allow
   the inline onclick handlers used throughout the UI.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 09:52:48 -04:00
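The three fixes above can be summarised in one options object of the shape helmet's `contentSecurityPolicy` accepts; directive values beyond those named in the commit are assumptions.

```javascript
// Sketch of the CSP options implied by the fixes above. useDefaults: false
// drops upgrade-insecure-requests; script-src-attr permits inline onclick.
const cspOptions = {
  useDefaults: false,
  directives: {
    'default-src': ["'self'"],                               // assumed baseline
    'style-src': ["'self'", 'https://fonts.googleapis.com'], // Google Fonts CSS
    'font-src': ['https://fonts.gstatic.com'],               // Google Fonts files
    'script-src-attr': ["'unsafe-inline'"],
  },
};
```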
79adc365d8 server/server.js — added helmet with CSP configured to allow Google Fonts
All checks were successful
Build / test (push) Successful in 9m29s
Build / release (push) Successful in 1s
Build / build (push) Successful in 32s
Dockerfile — creates a non-root app user and runs the process under it
server/routes.js — tailscale_ip validated against IPv4 regex (empty string still allowed)
index.html — sql.js CDN script tag already removed earlier in this session
2026-03-28 09:20:24 -04:00
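The tailscale_ip rule above amounts to "empty string or dotted-quad IPv4"; this is a sketch, not the routes.js regex itself.

```javascript
// Sketch of the validation described above: empty string passes, anything
// else must be four octets in the 0-255 range.
const IPV4_RE = /^(\d{1,3})\.(\d{1,3})\.(\d{1,3})\.(\d{1,3})$/;

function isValidTailscaleIp(value) {
  if (value === '') return true;
  const m = IPV4_RE.exec(value);
  return m !== null && m.slice(1).every((octet) => Number(octet) <= 255);
}
```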
22 changed files with 2392 additions and 222 deletions


@@ -2,7 +2,8 @@
"permissions": {
"allow": [
"Bash(npm test:*)",
"Bash(npm install:*)"
"Bash(npm install:*)",
"Bash(find /c/Users/josh1/Documents/Code/Catalyst -type f \\\\\\(-name *.test.js -o -name *.spec.js -o -name .env* -o -name *.config.js \\\\\\))"
]
}
}


@@ -1,84 +0,0 @@
name: Build
on:
  push:
    branches: [main]
    tags:
      - 'v*'
env:
  IMAGE: ${{ vars.REGISTRY_HOST }}/${{ gitea.repository_owner }}/catalyst
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Setup Node
        uses: actions/setup-node@v4
        with:
          node-version: 'lts/*'
          cache: npm
      - name: Install dependencies
        run: npm ci
      - name: Run tests
        run: npm test
  build:
    runs-on: ubuntu-latest
    needs: test
    if: startsWith(gitea.ref, 'refs/tags/v')
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Docker metadata
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: ${{ env.IMAGE }}
          tags: |
            type=semver,pattern={{version}}
            type=semver,pattern={{major}}.{{minor}}
            type=sha,prefix=,format=short
            type=raw,value=latest,enable={{is_default_branch}}
      - name: Log in to Gitea registry
        uses: docker/login-action@v3
        with:
          registry: ${{ vars.REGISTRY_HOST }}
          username: ${{ gitea.actor }}
          password: ${{ secrets.TOKEN }}
      - name: Build and push
        uses: docker/build-push-action@v5
        with:
          context: .
          push: true
          tags: ${{ steps.meta.outputs.tags }}
  release:
    runs-on: ubuntu-latest
    needs: build
    if: startsWith(gitea.ref, 'refs/tags/v')
    steps:
      - name: Create release
        run: |
          curl -sf -X POST \
            -H "Authorization: token ${{ secrets.TOKEN }}" \
            -H "Content-Type: application/json" \
            "${{ gitea.server_url }}/api/v1/repos/${{ gitea.repository }}/releases" \
            -d "{
              \"tag_name\": \"${{ gitea.ref_name }}\",
              \"name\": \"Catalyst ${{ gitea.ref_name }}\",
              \"body\": \"### Image\n\n\`${{ env.IMAGE }}:${{ gitea.ref_name }}\`\",
              \"draft\": false,
              \"prerelease\": false
            }"

.gitea/workflows/ci.yml (new file, 53 lines)

@@ -0,0 +1,53 @@
name: CI
on:
  push:
    branches: [dev, main]
  pull_request:
    branches: [dev, main]
env:
  IMAGE: ${{ vars.REGISTRY_HOST }}/${{ gitea.repository_owner }}/catalyst
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 'lts/*'
      - run: npm ci
      - run: npm test
  build-dev:
    runs-on: ubuntu-latest
    needs: test
    if: github.event_name == 'push' && github.ref == 'refs/heads/dev'
    steps:
      - uses: actions/checkout@v4
      - name: Log in to registry
        uses: docker/login-action@v3
        with:
          registry: ${{ vars.REGISTRY_HOST }}
          username: ${{ gitea.actor }}
          password: ${{ secrets.TOKEN }}
      - name: Compute short SHA
        run: echo "SHORT_SHA=$(git rev-parse --short HEAD)" >> $GITEA_ENV
      - name: Build and push
        uses: docker/build-push-action@v5
        with:
          context: .
          push: true
          build-args: BUILD_VERSION=dev-${{ env.SHORT_SHA }}
          tags: |
            ${{ env.IMAGE }}:dev
            ${{ env.IMAGE }}:dev-${{ gitea.sha }}


@@ -0,0 +1,109 @@
name: Release
on:
  push:
    branches: [main]
env:
  IMAGE: ${{ vars.REGISTRY_HOST }}/${{ gitea.repository_owner }}/catalyst
jobs:
  release:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          fetch-depth: 0
      - uses: actions/setup-node@v4
        with:
          node-version: 'lts/*'
      - run: npm ci
      - run: npm test
      - name: Read version
        run: |
          VERSION=$(node -p "require('./package.json').version")
          echo "VERSION=${VERSION}" >> $GITEA_ENV
      - name: Assert tag does not exist
        run: |
          if git ls-remote --tags origin "refs/tags/v${{ env.VERSION }}" | grep -q .; then
            echo "ERROR: tag v${{ env.VERSION }} already exists — bump version in package.json before merging to main."
            exit 1
          fi
      - name: Create and push tag
        run: |
          git config user.name "gitea-actions"
          git config user.email "actions@gitea"
          git tag "v${{ env.VERSION }}"
          git push origin "v${{ env.VERSION }}"
      - name: Generate release notes
        run: |
          LAST_TAG=$(git describe --tags --abbrev=0 HEAD^ 2>/dev/null || echo "")
          if [ -n "$LAST_TAG" ]; then
            git log "${LAST_TAG}..HEAD" --pretty=format:"- %s" --no-merges > /tmp/release_notes.txt
          else
            git log --pretty=format:"- %s" --no-merges > /tmp/release_notes.txt
          fi
      - name: Docker metadata
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: ${{ env.IMAGE }}
          tags: |
            type=semver,pattern={{version}},value=v${{ env.VERSION }}
            type=semver,pattern={{major}}.{{minor}},value=v${{ env.VERSION }}
            type=sha,prefix=,format=short
            type=raw,value=latest
      - name: Log in to registry
        uses: docker/login-action@v3
        with:
          registry: ${{ vars.REGISTRY_HOST }}
          username: ${{ gitea.actor }}
          password: ${{ secrets.TOKEN }}
      - name: Build and push
        uses: docker/build-push-action@v5
        with:
          context: .
          push: true
          tags: ${{ steps.meta.outputs.tags }}
      - name: Create Gitea release
        run: |
          cat > /tmp/make_release.py << 'PYEOF'
          import json, os
          v = os.environ['VERSION']
          img = os.environ['IMAGE']
          raw = open('/tmp/release_notes.txt').read().strip()
          feats, fixes = [], []
          for line in raw.splitlines():
              msg = line.lstrip('- ').strip()
              if msg.startswith('feat:'):
                  feats.append('- ' + msg[5:].strip())
              elif msg.startswith('fix:'):
                  fixes.append('- ' + msg[4:].strip())
          sections = []
          if feats:
              sections.append('### New Features\n\n' + '\n'.join(feats))
          if fixes:
              sections.append('### Bug Fixes\n\n' + '\n'.join(fixes))
          notes = '\n\n'.join(sections) or '_No changes_'
          body = notes + '\n\n### Image\n\n`' + img + ':' + v + '`'
          payload = {'tag_name': 'v'+v, 'name': 'Catalyst v'+v, 'body': body, 'draft': False, 'prerelease': False}
          open('/tmp/release_body.json', 'w').write(json.dumps(payload))
          PYEOF
          python3 /tmp/make_release.py
          curl -sf -X POST \
            -H "Authorization: token ${{ secrets.TOKEN }}" \
            -H "Content-Type: application/json" \
            "${{ gitea.server_url }}/api/v1/repos/${{ gitea.repository }}/releases" \
            --data @/tmp/release_body.json

.gitignore (vendored, 1 line changed)

@@ -1,5 +1,4 @@
node_modules/
js/version.js
data/*.db
data/*.db-shm
data/*.db-wal


@@ -1,12 +1,22 @@
FROM node:lts-alpine
RUN addgroup -S app && adduser -S app -G app
WORKDIR /app
COPY package*.json ./
RUN npm ci --omit=dev
COPY . .
RUN awk -F'"' '/"version"/{printf "const VERSION = \"%s\";\n", $4; exit}' \
package.json > js/version.js
ARG BUILD_VERSION=""
RUN if [ -n "$BUILD_VERSION" ]; then \
printf 'const VERSION = "%s";\n' "$BUILD_VERSION" > js/version.js; \
else \
awk -F'"' '/"version"/{printf "const VERSION = \"%s\";\n", $4; exit}' \
package.json > js/version.js; \
fi
RUN mkdir -p /app/data && chown -R app:app /app
USER app
EXPOSE 3000
CMD ["node", "server/server.js"]

README.md (155 lines changed)

@@ -1,39 +1,38 @@
# Catalyst
A self-hosted infrastructure registry. Track every VM, container, and service across your homelab — their state, stack, and which internal services are running on them.
A self-hosted infrastructure registry for homelab Proxmox environments. Track virtual machines across stacks, monitor service health, and maintain a full audit log of every configuration change.
---
## Features
- **Dashboard** — filterable, searchable instance list with state and stack badges
- **Detail pages** — per-instance view with service flags, Tailscale IP, and timestamps
- **Detail pages** — per-instance view with service flags, Tailscale IP, and a full change timeline
- **Audit log** — every field change is recorded with before/after values and a timestamp
- **Full CRUD** — add, edit, and delete instances via a clean modal interface
- **Production safeguard** — only development instances can be deleted; production instances must be demoted first
- **REST API** — every operation is a plain HTTP call; no magic, no framework lock-in
- **Persistent storage** — SQLite database on a Docker named volume; survives restarts and upgrades
- **Zero native dependencies** — SQLite via Node's built-in `node:sqlite`. No compilation, no binaries.
- **Export / import** — JSON backup and restore via the settings modal
- **REST API** — every operation is a plain HTTP call
- **Persistent storage** — SQLite on a Docker named volume; survives restarts and upgrades
- **Zero native dependencies** — SQLite via Node's built-in `node:sqlite`; no compilation, no binaries
---
## Quick start
```bash
docker run -d \
--name catalyst \
-p 3000:3000 \
-v catalyst-data:/app/data \
gitea.thewrightserver.net/josh/catalyst:latest
```
Or with the included Compose file:
```bash
docker compose up -d
```
Open [http://localhost:3000](http://localhost:3000).
### Environment variables
| Variable | Default | Description |
|---|---|---|
| `PORT` | `3000` | HTTP port the server binds to |
| `DB_PATH` | `data/catalyst.db` | Path to the SQLite database file |
---
## REST API
@@ -44,11 +43,11 @@ All endpoints are under `/api`. Request and response bodies are JSON.
#### `GET /api/instances`
Returns all instances, sorted by name. All query parameters are optional.
Returns all instances sorted by name. All query parameters are optional.
| Parameter | Type | Description |
|-----------|--------|-----------------------------------------|
| `search` | string | Partial match on `name` or `vmid` |
|---|---|---|
| `search` | string | Partial match on `name`, `vmid`, or `stack` |
| `state` | string | Exact match: `deployed`, `testing`, `degraded` |
| `stack` | string | Exact match: `production`, `development` |
@@ -64,11 +63,11 @@ GET /api/instances?search=plex&state=deployed
"state": "deployed",
"stack": "production",
"tailscale_ip": "100.64.0.1",
"atlas": 1, "argus": 0, "semaphore": 0,
"atlas": 1, "argus": 1, "semaphore": 0,
"patchmon": 1, "tailscale": 1, "andromeda": 0,
"hardware_acceleration": 1,
"created_at": "2024-01-15T10:30:00.000Z",
"updated_at": "2024-03-10T14:22:00.000Z"
"created_at": "2024-01-15T10:30:00",
"updated_at": "2024-03-10T14:22:00"
}
]
```
@@ -91,10 +90,43 @@ GET /api/instances/stacks
Returns a single instance by VMID.
| Status | Condition |
|--------|-----------|
|---|---|
| `200` | Instance found |
| `404` | No instance with that VMID |
| `400` | VMID is not a valid integer |
| `404` | No instance with that VMID |
---
#### `GET /api/instances/:vmid/history`
Returns the audit log for an instance — newest events first.
| Status | Condition |
|---|---|
| `200` | History returned (may be empty array) |
| `400` | VMID is not a valid integer |
| `404` | No instance with that VMID |
```json
[
{
"id": 3,
"vmid": 117,
"field": "state",
"old_value": "testing",
"new_value": "deployed",
"changed_at": "2024-03-10T14:22:00"
},
{
"id": 1,
"vmid": 117,
"field": "created",
"old_value": null,
"new_value": null,
"changed_at": "2024-01-15T10:30:00"
}
]
```
---
@@ -103,21 +135,21 @@ Returns a single instance by VMID.
Creates a new instance. Returns the created record.
| Status | Condition |
|--------|-----------|
|---|---|
| `201` | Created successfully |
| `400` | Validation error (see `errors` array in response) |
| `409` | VMID already exists |
**Request body:**
| Field | Type | Required | Notes |
|---|---|---|---|
| `name` | string | yes | |
| `vmid` | integer | yes | Must be > 0 and unique |
| `state` | string | yes | `deployed`, `testing`, or `degraded` |
| `stack` | string | yes | `production` or `development` |
| `tailscale_ip` | string | no | Valid IPv4 or empty string |
| `atlas` | 0\|1 | no | |
| `argus` | 0\|1 | no | |
| `semaphore` | 0\|1 | no | |
| `patchmon` | 0\|1 | no | |
@@ -132,7 +164,7 @@ Creates a new instance. Returns the created record.
Replaces all fields on an existing instance. Accepts the same body shape as `POST`. The `vmid` in the body may differ from the URL — this is how you change a VMID.
| Status | Condition |
|---|---|
| `200` | Updated successfully |
| `400` | Validation error |
| `404` | No instance with that VMID |
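The replace-and-rekey semantics above can be illustrated with a plain `Map` standing in for the store (a hypothetical helper, not the server code):

```javascript
// Hypothetical in-memory illustration of PUT: full replacement, with the
// body's vmid (not the URL's) becoming the record's new identifier.
function applyPut(store, urlVmid, body) {
  if (!store.has(urlVmid)) return { status: 404 };
  store.delete(urlVmid);        // full replace: the old record is gone
  store.set(body.vmid, body);   // body vmid wins — this is how a VMID changes
  return { status: 200, instance: body };
}
```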
@@ -145,11 +177,36 @@ Replaces all fields on an existing instance. Accepts the same body shape as `POS
Deletes an instance. Only instances on the `development` stack may be deleted.
| Status | Condition |
|---|---|
| `204` | Deleted successfully |
| `400` | VMID is not a valid integer |
| `404` | No instance with that VMID |
| `422` | Instance is on the `production` stack |
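The status table above implies a simple guard, sketched here as a pure function (assumed shape — the real check lives in the Express route):

```javascript
// Delete guard: only development-stack instances may be removed.
function deleteGuard(inst) {
  if (!inst) return { status: 404, error: 'not found' };
  if (inst.stack !== 'development') {
    return { status: 422, error: 'only development instances may be deleted' };
  }
  return { status: 204 };
}
```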
---
### Backup
#### `GET /api/export`
Downloads a JSON backup of all instances as a file attachment.
```json
{
"version": 1,
"exported_at": "2024-03-10T14:22:00.000Z",
"instances": [ ... ]
}
```
#### `POST /api/import`
Replaces all instances from a JSON backup. Validates every row before committing — if any row is invalid the entire import is rejected.
| Status | Condition |
|---|---|
| `200` | Import successful — returns `{ "imported": N }` |
| `400` | Body missing `instances` array, or validation errors |
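A hedged sketch of the validate-everything-first contract (field checks abbreviated — the real validator covers every column):

```javascript
// All-or-nothing import: collect every row error before touching the database.
function validateImport(body) {
  if (!Array.isArray(body?.instances)) {
    return { ok: false, error: 'body must contain an instances array' };
  }
  const errors = [];
  body.instances.forEach((row, i) => {
    if (!row.name) errors.push(`row ${i}: name is required`);
    if (!Number.isInteger(row.vmid) || row.vmid <= 0) {
      errors.push(`row ${i}: vmid must be a positive integer`);
    }
  });
  // A single bad row rejects the whole import — nothing is committed.
  return errors.length
    ? { ok: false, errors }
    : { ok: true, imported: body.instances.length };
}
```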
---
@@ -165,34 +222,22 @@ npm start # start the server on :3000
Tests are split across three files:
| File | What it covers |
|---|---|
| `tests/db.test.js` | SQLite data layer — CRUD, constraints, filters, history logging |
| `tests/api.test.js` | HTTP API — all endpoints, status codes, error cases |
| `tests/helpers.test.js` | UI helpers — `esc()` XSS contract, date formatting, history formatters |
---
## Versioning
Catalyst uses [semantic versioning](https://semver.org). `package.json` is the single source of truth.
| Change | Bump |
|---|---|
| Bug fix | patch |
| New feature, backward compatible | minor |
| Breaking change | major |
### Cutting a release
```bash
# 1. Bump version in package.json, then:
git add package.json
git commit -m "chore: release v1.1.0"
git tag v1.1.0
git push && git push --tags
```
Pushing a tag triggers the CI pipeline: **test → build → release**.
Docker images are tagged `:x.y.z`, `:x.y`, and `:latest`.
@@ -25,6 +25,10 @@
--mono: 'JetBrains Mono', 'IBM Plex Mono', monospace;
}
html {
zoom: 1.1;
}
html, body {
height: 100%;
background: var(--bg);
@@ -70,6 +74,19 @@ nav {
.nav-sep { flex: 1; }
.nav-btn {
background: none;
border: 1px solid var(--border2);
color: var(--text2);
border-radius: 6px;
padding: 4px 8px;
font-size: 14px;
cursor: pointer;
margin-left: 10px;
line-height: 1;
}
.nav-btn:hover { border-color: var(--accent); color: var(--accent); }
.nav-divider { color: var(--border2); }
.nav-status {
@@ -136,6 +153,8 @@ main { flex: 1; }
}
.stat-cell:last-child { border-right: none; }
.stat-clickable { cursor: pointer; user-select: none; }
.stat-clickable:hover { background: var(--bg2); }
.stat-label {
font-size: 10px;
@@ -289,6 +308,7 @@ select:focus { border-color: var(--accent); }
border-radius: 3px;
letter-spacing: 0.08em;
text-transform: uppercase;
text-align: center;
}
.badge.deployed { background: var(--accent2); color: var(--accent); }
@@ -360,16 +380,25 @@ select:focus { border-color: var(--accent); }
}
.detail-sub {
font-size: 13px;
margin-top: 8px;
display: flex;
align-items: center;
gap: 0;
}
.detail-sub > span {
display: flex;
align-items: center;
gap: 6px;
}
.detail-sub > span + span {
margin-left: 12px;
padding-left: 12px;
border-left: 1px solid var(--border);
}
.detail-sub .lbl { color: var(--text3); font-size: 11px; }
.detail-sub .val { color: var(--text); }
.detail-actions { display: flex; gap: 8px; }
@@ -614,6 +643,58 @@ select:focus { border-color: var(--accent); }
.confirm-actions { display: flex; justify-content: flex-end; gap: 10px; }
/* ── HISTORY TIMELINE ── */
.tl-item {
display: flex;
align-items: center;
justify-content: space-between;
gap: 24px;
padding: 9px 0;
border-bottom: 1px solid var(--border);
}
.tl-item:last-child { border-bottom: none; }
.tl-event { display: flex; align-items: center; gap: 7px; font-size: 13px; min-width: 0; }
.tl-label { color: var(--text2); }
.tl-sep { color: var(--text3); user-select: none; }
.tl-old { color: var(--text3); text-decoration: line-through; font-size: 12px; }
.tl-arrow { color: var(--text3); font-size: 11px; }
.tl-new { color: var(--text); font-weight: 500; }
.tl-time { color: var(--text3); font-size: 11px; white-space: nowrap; flex-shrink: 0; }
.tl-deployed { color: var(--accent); }
.tl-testing { color: var(--amber); }
.tl-degraded { color: var(--red); }
.tl-created .tl-event { color: var(--accent); font-weight: 500; }
.tl-empty { color: var(--text3); font-size: 12px; padding: 8px 0; }
/* ── SETTINGS MODAL ── */
#settings-modal .modal-body { padding-top: 0; }
.settings-section { padding: 16px 0; border-bottom: 1px solid var(--border); }
.settings-section:last-child { border-bottom: none; padding-bottom: 0; }
.settings-section-title {
font-size: 10px;
font-weight: 600;
text-transform: uppercase;
letter-spacing: 0.1em;
color: var(--text3);
margin-bottom: 8px;
}
.settings-desc { font-size: 12px; color: var(--text2); margin: 0 0 14px; line-height: 1.6; }
.settings-row { display: flex; align-items: center; gap: 12px; }
.settings-label { font-size: 13px; color: var(--text2); white-space: nowrap; min-width: 80px; }
.settings-select { flex: 1; }
.import-row { display: flex; gap: 10px; align-items: center; }
.import-file-input { flex: 1; }
.btn-secondary {
background: var(--bg3);
border-color: var(--border2);
color: var(--text);
}
.btn-secondary:hover { border-color: var(--accent); color: var(--accent); }
.btn-danger { background: var(--red2); border-color: var(--red); color: var(--text); }
.btn-danger:hover { background: var(--red); }
/* ── SCROLLBAR ── */
::-webkit-scrollbar { width: 6px; }
::-webkit-scrollbar-track { background: var(--bg); }
@@ -633,3 +714,142 @@ select:focus { border-color: var(--accent); }
0%, 100% { opacity: 1; }
50% { opacity: 0; }
}
/* ── MOBILE ── */
@media (max-width: 640px) {
/* Reset desktop zoom — mobile browsers handle scaling themselves */
html { zoom: 1; }
/* Nav */
nav { padding: 0 16px; }
/* Dashboard header */
.dash-header { padding: 18px 16px 14px; }
/* Stats bar */
.stat-cell { padding: 10px 16px; }
/* Toolbar — search full-width on first row, filters + button below */
.toolbar { flex-wrap: wrap; padding: 10px 16px; gap: 8px; }
.search-wrap { max-width: 100%; }
.toolbar-right { margin-left: 0; width: 100%; justify-content: flex-end; }
/* Instance grid — single column */
.instance-grid {
grid-template-columns: 1fr;
padding: 12px 16px;
gap: 8px;
}
/* Detail page */
.detail-page { padding: 16px; }
/* Detail header — stack title block above actions */
.detail-header { flex-direction: column; align-items: flex-start; gap: 14px; }
/* Detail sub — wrap items when they don't fit */
.detail-sub { flex-wrap: wrap; row-gap: 4px; }
/* Detail grid — single column */
.detail-grid { grid-template-columns: 1fr; }
/* Toggle grid — 2 columns instead of 3 */
.toggle-grid { grid-template-columns: 1fr 1fr; }
/* Confirm box — no fixed width on mobile */
.confirm-box { width: auto; max-width: calc(100vw - 32px); padding: 18px; }
/* History timeline — stack timestamp above event */
.tl-item { flex-direction: column; align-items: flex-start; gap: 3px; }
.tl-time { order: -1; }
/* Toast — stretch across bottom */
.toast { right: 16px; left: 16px; bottom: 16px; }
/* Jobs — stack sidebar above detail */
.jobs-layout { grid-template-columns: 1fr; }
.jobs-sidebar { border-right: none; border-bottom: 1px solid var(--border); }
}
/* ── JOBS PAGE ───────────────────────────────────────────────────────────────── */
.jobs-layout {
display: grid;
grid-template-columns: 220px 1fr;
height: calc(100vh - 48px);
}
.jobs-sidebar {
border-right: 1px solid var(--border);
overflow-y: auto;
}
.jobs-sidebar-title {
padding: 16px 16px 8px;
font-size: 10px;
font-weight: 600;
text-transform: uppercase;
letter-spacing: 0.1em;
color: var(--text3);
}
.job-item {
display: flex;
align-items: center;
gap: 10px;
padding: 12px 16px;
cursor: pointer;
border-bottom: 1px solid var(--border);
user-select: none;
}
.job-item:hover, .job-item.active { background: var(--bg2); }
.job-item-name { font-size: 13px; color: var(--text); }
.jobs-detail {
padding: 28px 32px;
overflow-y: auto;
max-width: 600px;
}
.jobs-detail-hd { margin-bottom: 20px; }
.jobs-detail-title { font-size: 17px; font-weight: 600; color: var(--text); }
.jobs-detail-desc { font-size: 12px; color: var(--text2); margin-top: 4px; line-height: 1.6; }
.job-actions { display: flex; gap: 8px; margin: 16px 0 0; }
.jobs-placeholder { padding: 48px 32px; color: var(--text3); font-size: 13px; }
/* Shared job status dot */
.job-dot {
width: 7px;
height: 7px;
border-radius: 50%;
flex-shrink: 0;
display: inline-block;
}
.job-dot--success { background: var(--accent); }
.job-dot--error { background: var(--red); }
.job-dot--running { background: var(--amber); animation: pulse 2s ease-in-out infinite; }
.job-dot--none { background: var(--border2); }
/* Run history list */
.run-item {
display: grid;
grid-template-columns: 10px 1fr 60px 1fr;
gap: 0 12px;
padding: 7px 0;
border-bottom: 1px solid var(--border);
font-size: 12px;
align-items: baseline;
}
.run-item:last-child { border-bottom: none; }
.run-time { color: var(--text3); }
.run-status { color: var(--text2); }
.run-result { color: var(--text); }
.run-empty { color: var(--text3); font-size: 12px; padding: 8px 0; }
/* Nav dot */
.nav-job-dot {
display: inline-block;
width: 6px;
height: 6px;
border-radius: 50%;
margin-left: 5px;
vertical-align: middle;
}
.nav-job-dot--success { background: var(--accent); }
.nav-job-dot--error { background: var(--red); }
.nav-job-dot--none { display: none; }
@@ -3,6 +3,7 @@
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<base href="/">
<title>Catalyst</title>
<link rel="preconnect" href="https://fonts.googleapis.com">
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
@@ -21,6 +22,8 @@
<span class="nav-divider">·</span>
<span id="nav-version"></span>
</div>
<button class="nav-btn" onclick="navigate('jobs')">Jobs <span id="nav-jobs-dot" class="nav-job-dot nav-job-dot--none"></span></button>
<button class="nav-btn" onclick="openSettingsModal()" title="Settings">&#9881;</button>
</nav>
<main>
@@ -45,6 +48,13 @@
<select id="filter-stack" onchange="filterInstances()">
<option value="">all stacks</option>
</select>
<select id="sort-field" onchange="filterInstances()">
<option value="name">name</option>
<option value="vmid">vmid</option>
<option value="updated_at">last updated</option>
<option value="created_at">last created</option>
</select>
<button id="sort-dir" class="btn" data-dir="asc" onclick="toggleSortDir()" title="reverse sort">↑</button>
<div class="toolbar-right">
<button class="btn primary" onclick="openNewModal()">+ new instance</button>
</div>
@@ -67,7 +77,6 @@
<div class="detail-name" id="detail-name"></div>
<div class="detail-sub">
<span><span class="lbl">vmid</span> <span class="val" id="detail-vmid-sub"></span></span>
<span><span class="lbl">created</span> <span class="val" id="detail-created-sub"></span></span>
</div>
</div>
@@ -90,12 +99,25 @@
<div class="services-grid" id="detail-services"></div>
</div>
<div class="detail-section full">
<div class="section-title">history</div>
<div id="detail-timestamps"></div>
</div>
</div>
</div>
</div>
<!-- JOBS PAGE -->
<div class="page" id="page-jobs">
<div class="jobs-layout">
<div class="jobs-sidebar">
<div class="jobs-sidebar-title">Jobs</div>
<div id="jobs-list"></div>
</div>
<div class="jobs-detail" id="jobs-detail">
<div class="jobs-placeholder">Select a job</div>
</div>
</div>
</div>
</main>
</div>
@@ -170,13 +192,44 @@
</div>
</div>
<!-- SETTINGS MODAL -->
<div id="settings-modal" class="modal-overlay">
<div class="modal">
<div class="modal-header">
<span class="modal-title">Settings</span>
<button class="modal-close" onclick="closeSettingsModal()">&#x2715;</button>
</div>
<div class="modal-body">
<div class="settings-section">
<div class="settings-section-title">Display</div>
<div class="settings-row">
<label class="settings-label" for="tz-select">Timezone</label>
<select id="tz-select" class="form-input settings-select"></select>
</div>
</div>
<div class="settings-section">
<div class="settings-section-title">Export</div>
<p class="settings-desc">Download all instance data as a JSON backup file.</p>
<button class="btn btn-secondary" onclick="exportDB()">Export Database</button>
</div>
<div class="settings-section">
<div class="settings-section-title">Import</div>
<p class="settings-desc">Restore from a backup file. This replaces all current instances.</p>
<div class="import-row">
<input type="file" id="import-file" accept=".json" class="form-input import-file-input">
<button class="btn btn-danger" onclick="importDB()">Import</button>
</div>
</div>
</div>
</div>
</div>
<!-- TOAST -->
<div class="toast" id="toast">
<div class="toast-dot"></div>
<span id="toast-msg"></span>
</div>
<script src="https://cdnjs.cloudflare.com/ajax/libs/sql.js/1.10.2/sql-wasm.js"></script>
<script src="js/version.js" onerror="window.VERSION=null"></script>
<script src="js/config.js"></script>
<script src="js/db.js"></script>
@@ -11,12 +11,19 @@ function navigate(page, vmid) {
document.getElementById('page-detail').classList.add('active');
history.pushState({ page: 'instance', vmid }, '', `/instance/${vmid}`);
renderDetailPage(vmid);
} else if (page === 'jobs') {
document.getElementById('page-jobs').classList.add('active');
history.pushState({ page: 'jobs' }, '', '/jobs');
renderJobsPage();
}
}
function handleRoute() {
const m = window.location.pathname.match(/^\/instance\/(\d+)/);
if (window.location.pathname === '/jobs') {
document.getElementById('page-jobs').classList.add('active');
renderJobsPage();
} else if (m) {
document.getElementById('page-detail').classList.add('active');
renderDetailPage(parseInt(m[1], 10));
} else {
@@ -30,6 +37,9 @@ window.addEventListener('popstate', e => {
if (e.state?.page === 'instance') {
document.getElementById('page-detail').classList.add('active');
renderDetailPage(e.state.vmid);
} else if (e.state?.page === 'jobs') {
document.getElementById('page-jobs').classList.add('active');
renderJobsPage();
} else {
document.getElementById('page-dashboard').classList.add('active');
renderDashboard();
@@ -38,6 +48,11 @@ window.addEventListener('popstate', e => {
// ── Bootstrap ─────────────────────────────────────────────────────────────────
if (VERSION) {
const label = /^\d/.test(VERSION) ? `v${VERSION}` : VERSION;
document.getElementById('nav-version').textContent = label;
}
fetch('/api/jobs').then(r => r.json()).then(_updateJobsNavDot).catch(() => {});
handleRoute();
@@ -55,3 +55,8 @@ async function updateInstance(vmid, data) {
async function deleteInstance(vmid) {
await api(`/instances/${vmid}`, { method: 'DELETE' });
}
async function getInstanceHistory(vmid) {
const res = await fetch(`${BASE}/instances/${vmid}/history`);
return res.json();
}
@@ -3,6 +3,34 @@ let editingVmid = null;
let currentVmid = null;
let toastTimer = null;
// ── Timezone ──────────────────────────────────────────────────────────────────
const TIMEZONES = [
{ label: 'UTC', tz: 'UTC' },
{ label: 'Hawaii (HST)', tz: 'Pacific/Honolulu' },
{ label: 'Alaska (AKT)', tz: 'America/Anchorage' },
{ label: 'Pacific (PT)', tz: 'America/Los_Angeles' },
{ label: 'Mountain (MT)', tz: 'America/Denver' },
{ label: 'Central (CT)', tz: 'America/Chicago' },
{ label: 'Eastern (ET)', tz: 'America/New_York' },
{ label: 'Atlantic (AT)', tz: 'America/Halifax' },
{ label: 'London (GMT/BST)', tz: 'Europe/London' },
{ label: 'Paris / Berlin (CET)', tz: 'Europe/Paris' },
{ label: 'Helsinki (EET)', tz: 'Europe/Helsinki' },
{ label: 'Istanbul (TRT)', tz: 'Europe/Istanbul' },
{ label: 'Dubai (GST)', tz: 'Asia/Dubai' },
{ label: 'India (IST)', tz: 'Asia/Kolkata' },
{ label: 'Singapore (SGT)', tz: 'Asia/Singapore' },
{ label: 'China (CST)', tz: 'Asia/Shanghai' },
{ label: 'Japan / Korea (JST/KST)', tz: 'Asia/Tokyo' },
{ label: 'Sydney (AEST)', tz: 'Australia/Sydney' },
{ label: 'Auckland (NZST)', tz: 'Pacific/Auckland' },
];
function getTimezone() {
return localStorage.getItem('catalyst_tz') || 'UTC';
}
// ── Helpers ───────────────────────────────────────────────────────────────────
function esc(str) {
@@ -11,17 +39,25 @@ function esc(str) {
return d.innerHTML;
}
// SQLite datetime('now') → 'YYYY-MM-DD HH:MM:SS' (UTC, no timezone marker).
// Appending 'Z' tells JS to parse it as UTC rather than local time.
function parseUtc(d) {
if (typeof d !== 'string') return new Date(d);
const hasZone = d.endsWith('Z') || /[+-]\d{2}:\d{2}$/.test(d);
return new Date(hasZone ? d : d.replace(' ', 'T') + 'Z');
}
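A quick illustration of why the appended `'Z'` matters (values follow the SQLite shape described in the comment above):

```javascript
// Without the 'Z', new Date() parses the SQLite string in the browser's local
// zone, shifting every displayed timestamp by the viewer's UTC offset.
const naive = '2024-03-10 14:22:00';                 // SQLite datetime('now') shape
const utc = new Date(naive.replace(' ', 'T') + 'Z');
console.log(utc.getUTCHours(), utc.getUTCMinutes()); // 14 22 — stable in any locale
```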
function fmtDate(d) {
if (!d) return '—';
try {
return parseUtc(d).toLocaleDateString('en-US', { year: 'numeric', month: 'short', day: 'numeric', timeZone: getTimezone() });
} catch (e) { return d; }
}
function fmtDateFull(d) {
if (!d) return '—';
try {
return parseUtc(d).toLocaleString('en-US', { year: 'numeric', month: 'short', day: 'numeric', hour: '2-digit', minute: '2-digit', timeZone: getTimezone(), timeZoneName: 'short' });
} catch (e) { return d; }
}
@@ -35,11 +71,10 @@ async function renderDashboard() {
all.forEach(i => { states[i.state] = (states[i.state] || 0) + 1; });
document.getElementById('stats-bar').innerHTML = `
<div class="stat-cell stat-clickable" onclick="setStateFilter('')"><div class="stat-label">total</div><div class="stat-value accent">${all.length}</div></div>
<div class="stat-cell stat-clickable" onclick="setStateFilter('deployed')"><div class="stat-label">deployed</div><div class="stat-value">${states['deployed'] || 0}</div></div>
<div class="stat-cell stat-clickable" onclick="setStateFilter('testing')"><div class="stat-label">testing</div><div class="stat-value amber">${states['testing'] || 0}</div></div>
<div class="stat-cell stat-clickable" onclick="setStateFilter('degraded')"><div class="stat-label">degraded</div><div class="stat-value red">${states['degraded'] || 0}</div></div>
`;
await populateStackFilter();
@@ -60,11 +95,26 @@ async function populateStackFilter() {
});
}
function setStateFilter(state) {
document.getElementById('filter-state').value = state;
filterInstances();
}
function toggleSortDir() {
const btn = document.getElementById('sort-dir');
const next = btn.dataset.dir === 'asc' ? 'desc' : 'asc';
btn.dataset.dir = next;
btn.textContent = next === 'asc' ? '↑' : '↓';
filterInstances();
}
async function filterInstances() {
const search = document.getElementById('search-input').value;
const state = document.getElementById('filter-state').value;
const stack = document.getElementById('filter-stack').value;
const instances = await getInstances({ search, state, stack });
const sort = document.getElementById('sort-field').value;
const order = document.getElementById('sort-dir').dataset.dir || 'asc';
const instances = await getInstances({ search, state, stack, sort, order });
const grid = document.getElementById('instance-grid');
if (!instances.length) {
@@ -100,23 +150,50 @@ async function filterInstances() {
// ── Detail Page ───────────────────────────────────────────────────────────────
const BOOL_FIELDS = ['atlas','argus','semaphore','patchmon','tailscale','andromeda','hardware_acceleration'];
const FIELD_LABELS = {
name: 'name',
state: 'state',
stack: 'stack',
vmid: 'vmid',
tailscale_ip: 'tailscale ip',
atlas: 'atlas',
argus: 'argus',
semaphore: 'semaphore',
patchmon: 'patchmon',
tailscale: 'tailscale',
andromeda: 'andromeda',
hardware_acceleration: 'hw acceleration',
};
function stateClass(field, val) {
if (field !== 'state') return '';
return { deployed: 'tl-deployed', testing: 'tl-testing', degraded: 'tl-degraded' }[val] ?? '';
}
function fmtHistVal(field, val) {
if (val == null || val === '') return '—';
if (BOOL_FIELDS.includes(field)) return val === '1' ? 'on' : 'off';
return esc(val);
}
async function renderDetailPage(vmid) {
const [inst, history, all] = await Promise.all([getInstance(vmid), getInstanceHistory(vmid), getInstances()]);
if (!inst) { navigate('dashboard'); return; }
currentVmid = vmid;
document.getElementById('nav-count').textContent = `${all.length} instance${all.length !== 1 ? 's' : ''}`;
document.getElementById('detail-vmid-crumb').textContent = vmid;
document.getElementById('detail-name').textContent = inst.name;
document.getElementById('detail-vmid-sub').textContent = inst.vmid;
document.getElementById('detail-created-sub').textContent = fmtDate(inst.created_at);
document.getElementById('detail-identity').innerHTML = `
<div class="kv-row"><span class="kv-key">name</span><span class="kv-val highlight">${esc(inst.name)}</span></div>
<div class="kv-row"><span class="kv-key">state</span><span class="kv-val"><span class="badge ${esc(inst.state)}">${esc(inst.state)}</span></span></div>
<div class="kv-row"><span class="kv-key">stack</span><span class="kv-val"><span class="badge ${esc(inst.stack)}">${esc(inst.stack) || '—'}</span></span></div>
<div class="kv-row"><span class="kv-key">vmid</span><span class="kv-val highlight">${inst.vmid}</span></div>
<div class="kv-row"><span class="kv-key">internal id</span><span class="kv-val">${inst.id}</span></div>
`;
document.getElementById('detail-network').innerHTML = `
@@ -134,10 +211,30 @@ async function renderDetailPage(vmid) {
</div>
`).join('');
document.getElementById('detail-timestamps').innerHTML = history.length
? history.map(e => {
if (e.field === 'created') return `
<div class="tl-item tl-created">
<span class="tl-event">instance created</span>
<span class="tl-time">${fmtDateFull(e.changed_at)}</span>
</div>`;
const label = FIELD_LABELS[e.field] ?? esc(e.field);
const newCls = (e.field === 'state' || e.field === 'stack')
? `badge ${esc(e.new_value)}`
: `tl-new ${stateClass(e.field, e.new_value)}`;
return `
<div class="tl-item">
<div class="tl-event">
<span class="tl-label">${label}</span>
<span class="tl-sep">·</span>
<span class="tl-old">${fmtHistVal(e.field, e.old_value)}</span>
<span class="tl-arrow">→</span>
<span class="${newCls}">${fmtHistVal(e.field, e.new_value)}</span>
</div>
<span class="tl-time">${fmtDateFull(e.changed_at)}</span>
</div>`;
}).join('')
: '<div class="tl-empty">no history yet</div>';
document.getElementById('detail-edit-btn').onclick = () => openEditModal(inst.vmid);
document.getElementById('detail-delete-btn').onclick = () => confirmDeleteDialog(inst);
@@ -207,6 +304,10 @@ async function saveInstance() {
hardware_acceleration: +document.getElementById('f-hardware-accel').checked,
};
// Snapshot job state before creation — jobs fire immediately after the 201
// so the baseline must be captured before the POST, not after.
const jobBaseline = !editingVmid ? await _snapshotJobBaseline() : null;
const result = editingVmid
? await updateInstance(editingVmid, data)
: await createInstance(data);
@@ -216,6 +317,8 @@ async function saveInstance() {
showToast(`${name} ${editingVmid ? 'updated' : 'created'}`, 'success');
closeModal();
if (jobBaseline) await _waitForOnCreateJobs(jobBaseline);
if (currentVmid && document.getElementById('page-detail').classList.contains('active')) {
await renderDetailPage(vmid);
} else {
@@ -223,6 +326,30 @@ async function saveInstance() {
}
}
async function _snapshotJobBaseline() {
const jobs = await fetch('/api/jobs').then(r => r.json());
return new Map(jobs.map(j => [j.id, j.last_run_id ?? null]));
}
async function _waitForOnCreateJobs(baseline) {
const jobs = await fetch('/api/jobs').then(r => r.json());
const relevant = jobs.filter(j => (j.config ?? {}).run_on_create);
if (!relevant.length) return;
const deadline = Date.now() + 30_000;
while (Date.now() < deadline) {
await new Promise(r => setTimeout(r, 500));
const current = await fetch('/api/jobs').then(r => r.json());
const allDone = relevant.every(j => {
const cur = current.find(c => c.id === j.id);
if (!cur) return true;
if (cur.last_run_id === baseline.get(j.id)) return false; // new run not started yet
return cur.last_status !== 'running'; // new run complete
});
if (allDone) return;
}
}
// ── Confirm Dialog ────────────────────────────────────────────────────────────
function confirmDeleteDialog(inst) {
@@ -258,12 +385,74 @@ function showToast(msg, type = 'success') {
toastTimer = setTimeout(() => t.classList.remove('show'), 3000);
}
// ── Settings Modal ────────────────────────────────────────────────────────────
function openSettingsModal() {
const sel = document.getElementById('tz-select');
if (!sel.options.length) {
for (const { label, tz } of TIMEZONES) {
const opt = document.createElement('option');
opt.value = tz;
opt.textContent = label;
sel.appendChild(opt);
}
}
sel.value = getTimezone();
document.getElementById('settings-modal').classList.add('open');
}
function closeSettingsModal() {
document.getElementById('settings-modal').classList.remove('open');
document.getElementById('import-file').value = '';
}
async function exportDB() {
const res = await fetch('/api/export');
const blob = await res.blob();
const url = URL.createObjectURL(blob);
const a = document.createElement('a');
a.href = url;
a.download = `catalyst-backup-${new Date().toISOString().slice(0, 10)}.json`;
a.click();
URL.revokeObjectURL(url);
}
async function importDB() {
const file = document.getElementById('import-file').files[0];
if (!file) { showToast('Select a backup file first', 'error'); return; }
document.getElementById('confirm-title').textContent = 'Replace all instances?';
document.getElementById('confirm-msg').textContent =
`This will delete all current instances and replace them with the contents of "${file.name}". This cannot be undone.`;
document.getElementById('confirm-overlay').classList.add('open');
document.getElementById('confirm-ok').onclick = async () => {
closeConfirm();
try {
const { instances, history = [], jobs, job_runs } = JSON.parse(await file.text());
const res = await fetch('/api/import', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ instances, history, jobs, job_runs }),
});
const data = await res.json();
if (!res.ok) { showToast(data.error ?? 'Import failed', 'error'); return; }
const parts = [`${data.imported} instance${data.imported !== 1 ? 's' : ''}`];
if (data.imported_jobs != null) parts.push(`${data.imported_jobs} job${data.imported_jobs !== 1 ? 's' : ''}`);
showToast(`Imported ${parts.join(', ')}`, 'success');
closeSettingsModal();
renderDashboard();
} catch {
showToast('Invalid backup file', 'error');
}
};
}
// ── Keyboard / backdrop ───────────────────────────────────────────────────────
document.addEventListener('keydown', e => {
if (e.key !== 'Escape') return;
if (document.getElementById('instance-modal').classList.contains('open')) { closeModal(); return; }
if (document.getElementById('confirm-overlay').classList.contains('open')) { closeConfirm(); return; }
if (document.getElementById('settings-modal').classList.contains('open')) { closeSettingsModal(); return; }
});
document.getElementById('instance-modal').addEventListener('click', e => {
@@ -272,3 +461,150 @@ document.getElementById('instance-modal').addEventListener('click', e => {
document.getElementById('confirm-overlay').addEventListener('click', e => {
if (e.target === document.getElementById('confirm-overlay')) closeConfirm();
});
document.getElementById('settings-modal').addEventListener('click', e => {
if (e.target === document.getElementById('settings-modal')) closeSettingsModal();
});
document.getElementById('tz-select').addEventListener('change', e => {
localStorage.setItem('catalyst_tz', e.target.value);
const m = window.location.pathname.match(/^\/instance\/(\d+)/);
if (m) renderDetailPage(parseInt(m[1], 10));
else renderDashboard();
});
// ── Jobs Page ─────────────────────────────────────────────────────────────────
async function renderJobsPage() {
const jobs = await fetch('/api/jobs').then(r => r.json());
_updateJobsNavDot(jobs);
document.getElementById('jobs-list').innerHTML = jobs.length
? jobs.map(j => `
<div class="job-item" id="job-item-${j.id}" onclick="loadJobDetail(${j.id})">
<span class="job-dot job-dot--${j.last_status ?? 'none'}"></span>
<span class="job-item-name">${esc(j.name)}</span>
</div>`).join('')
: '<div class="jobs-placeholder">No jobs</div>';
if (jobs.length) loadJobDetail(jobs[0].id);
}
async function loadJobDetail(jobId) {
document.querySelectorAll('.job-item').forEach(el => el.classList.remove('active'));
document.getElementById(`job-item-${jobId}`)?.classList.add('active');
const job = await fetch(`/api/jobs/${jobId}`).then(r => r.json());
const cfg = job.config ?? {};
document.getElementById('jobs-detail').innerHTML = `
<div class="jobs-detail-hd">
<div class="jobs-detail-title">${esc(job.name)}</div>
<div class="jobs-detail-desc">${esc(job.description)}</div>
</div>
<div class="form-group">
<label class="form-label" style="display:flex;align-items:center;gap:8px;cursor:pointer">
<input type="checkbox" id="job-enabled" ${job.enabled ? 'checked' : ''}
style="accent-color:var(--accent);width:13px;height:13px">
Enable scheduled runs
</label>
</div>
<div class="form-group">
<label class="form-label" for="job-schedule">Poll interval (minutes)</label>
<input class="form-input" id="job-schedule" type="number" min="1" value="${job.schedule}" style="max-width:100px">
</div>
<div class="form-group">
<label class="form-label" style="display:flex;align-items:center;gap:8px;cursor:pointer">
<input type="checkbox" id="job-run-on-create" ${cfg.run_on_create ? 'checked' : ''}
style="accent-color:var(--accent);width:13px;height:13px">
Run on instance creation
</label>
</div>
${_renderJobConfigFields(job.key, cfg)}
<div class="job-actions">
<button class="btn btn-secondary" onclick="saveJobDetail(${job.id})">Save</button>
<button class="btn btn-secondary" id="job-run-btn" onclick="runJobNow(${job.id})">Run Now</button>
</div>
<div class="detail-section-title" style="margin:28px 0 10px">Run History</div>
${_renderRunList(job.runs)}
`;
}
function _renderJobConfigFields(key, cfg) {
if (key === 'tailscale_sync') return `
<div class="form-group">
<label class="form-label" for="job-cfg-tailnet">Tailnet</label>
<input class="form-input" id="job-cfg-tailnet" type="text"
placeholder="e.g. Tt3Btpm6D921CNTRL" value="${esc(cfg.tailnet ?? '')}">
</div>
<div class="form-group">
<label class="form-label" for="job-cfg-api-key">API Key</label>
<input class="form-input" id="job-cfg-api-key" type="password"
placeholder="tskey-api-…" value="${esc(cfg.api_key ?? '')}">
</div>`;
if (key === 'patchmon_sync' || key === 'semaphore_sync') {
const label = key === 'semaphore_sync' ? 'API Token (Bearer)' : 'API Token (Basic)';
return `
<div class="form-group">
<label class="form-label" for="job-cfg-api-url">API URL</label>
<input class="form-input" id="job-cfg-api-url" type="text"
value="${esc(cfg.api_url ?? '')}">
</div>
<div class="form-group">
<label class="form-label" for="job-cfg-api-token">${label}</label>
<input class="form-input" id="job-cfg-api-token" type="password"
value="${esc(cfg.api_token ?? '')}">
</div>`;
}
return '';
}
function _renderRunList(runs) {
if (!runs?.length) return '<div class="run-empty">No runs yet</div>';
return `<div class="run-list">${runs.map(r => `
<div class="run-item">
<span class="job-dot job-dot--${r.status}"></span>
<span class="run-time">${fmtDateFull(r.started_at)}</span>
<span class="run-status">${esc(r.status)}</span>
<span class="run-result">${esc(r.result)}</span>
</div>`).join('')}</div>`;
}
async function saveJobDetail(jobId) {
const enabled = document.getElementById('job-enabled').checked;
const schedule = document.getElementById('job-schedule').value;
const cfg = {};
const tailnet = document.getElementById('job-cfg-tailnet');
const apiKey = document.getElementById('job-cfg-api-key');
const apiUrl = document.getElementById('job-cfg-api-url');
const apiToken = document.getElementById('job-cfg-api-token');
if (tailnet) cfg.tailnet = tailnet.value.trim();
if (apiKey) cfg.api_key = apiKey.value;
if (apiUrl) cfg.api_url = apiUrl.value.trim();
if (apiToken) cfg.api_token = apiToken.value;
const runOnCreate = document.getElementById('job-run-on-create');
if (runOnCreate) cfg.run_on_create = runOnCreate.checked;
const res = await fetch(`/api/jobs/${jobId}`, {
method: 'PUT',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ enabled, schedule: parseInt(schedule, 10), config: cfg }),
});
if (res.ok) { showToast('Job saved', 'success'); loadJobDetail(jobId); }
else { showToast('Failed to save', 'error'); }
}
async function runJobNow(jobId) {
const btn = document.getElementById('job-run-btn');
btn.disabled = true;
btn.textContent = 'Running…';
try {
const res = await fetch(`/api/jobs/${jobId}/run`, { method: 'POST' });
const data = await res.json();
if (res.ok) { showToast(`Done — ${data.summary}`, 'success'); loadJobDetail(jobId); }
else { showToast(data.error ?? 'Run failed', 'error'); }
} catch { showToast('Run failed', 'error'); }
finally { btn.disabled = false; btn.textContent = 'Run Now'; }
}
function _updateJobsNavDot(jobs) {
const dot = document.getElementById('nav-jobs-dot');
const cls = jobs.some(j => j.last_status === 'error') ? 'error'
: jobs.some(j => j.last_status === 'success') ? 'success'
: 'none';
dot.className = `nav-job-dot nav-job-dot--${cls}`;
}
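The priority rule above (any errored job wins, otherwise any success, otherwise none) can be sketched in isolation. A minimal sketch; `dotClass` is a hypothetical helper, not part of the codebase:

```javascript
// Sketch of the nav-dot priority in _updateJobsNavDot: error beats
// success beats none. dotClass is a hypothetical stand-in for the
// inline chained ternary.
function dotClass(jobs) {
  return jobs.some(j => j.last_status === 'error') ? 'error'
    : jobs.some(j => j.last_status === 'success') ? 'success'
    : 'none';
}

console.log(dotClass([{ last_status: 'success' }, { last_status: 'error' }])); // error
console.log(dotClass([{ last_status: 'success' }]));                           // success
console.log(dotClass([]));                                                     // none
```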

js/version.js (new file, +1)

@@ -0,0 +1 @@
const VERSION = "1.5.0";

package-lock.json (generated, +16)

@@ -1,14 +1,15 @@
{
"name": "catalyst",
"version": "1.0.3",
"version": "1.1.0",
"lockfileVersion": 3,
"requires": true,
"packages": {
"": {
"name": "catalyst",
"version": "1.0.3",
"version": "1.1.0",
"dependencies": {
"express": "^4.18.0"
"express": "^4.18.0",
"helmet": "^8.1.0"
},
"devDependencies": {
"jsdom": "^25.0.0",
@@ -1958,6 +1959,15 @@
"node": ">= 0.4"
}
},
"node_modules/helmet": {
"version": "8.1.0",
"resolved": "https://registry.npmjs.org/helmet/-/helmet-8.1.0.tgz",
"integrity": "sha512-jOiHyAZsmnr8LqoPGmCjYAaiuWwjAPLgY8ZX2XrmHawt99/u1y6RgrZMTeoPfpUbV96HOalYgz1qzkRbw54Pmg==",
"license": "MIT",
"engines": {
"node": ">=18.0.0"
}
},
"node_modules/html-encoding-sniffer": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/html-encoding-sniffer/-/html-encoding-sniffer-4.0.0.tgz",


@@ -1,6 +1,6 @@
{
"name": "catalyst",
"version": "1.1.0",
"version": "1.6.0",
"type": "module",
"scripts": {
"start": "node server/server.js",
@@ -9,7 +9,8 @@
"version:write": "node -e \"const {version}=JSON.parse(require('fs').readFileSync('package.json','utf8'));require('fs').writeFileSync('js/version.js','const VERSION = \\\"'+version+'\\\";\\n');\""
},
"dependencies": {
"express": "^4.18.0"
"express": "^4.18.0",
"helmet": "^8.1.0"
},
"devDependencies": {
"jsdom": "^25.0.0",


@@ -17,7 +17,7 @@ function init(path) {
db.exec('PRAGMA foreign_keys = ON');
db.exec('PRAGMA synchronous = NORMAL');
createSchema();
if (path !== ':memory:') seed();
if (path !== ':memory:') { seed(); seedJobs(); }
}
function createSchema() {
@@ -43,6 +43,41 @@ function createSchema() {
);
CREATE INDEX IF NOT EXISTS idx_instances_state ON instances(state);
CREATE INDEX IF NOT EXISTS idx_instances_stack ON instances(stack);
CREATE TABLE IF NOT EXISTS instance_history (
id INTEGER PRIMARY KEY AUTOINCREMENT,
vmid INTEGER NOT NULL,
field TEXT NOT NULL,
old_value TEXT,
new_value TEXT,
changed_at TEXT NOT NULL DEFAULT (datetime('now'))
);
CREATE INDEX IF NOT EXISTS idx_history_vmid ON instance_history(vmid);
CREATE TABLE IF NOT EXISTS config (
key TEXT PRIMARY KEY,
value TEXT NOT NULL DEFAULT ''
);
CREATE TABLE IF NOT EXISTS jobs (
id INTEGER PRIMARY KEY AUTOINCREMENT,
key TEXT NOT NULL UNIQUE,
name TEXT NOT NULL,
description TEXT NOT NULL DEFAULT '',
enabled INTEGER NOT NULL DEFAULT 0 CHECK(enabled IN (0,1)),
schedule INTEGER NOT NULL DEFAULT 15,
config TEXT NOT NULL DEFAULT '{}'
);
CREATE TABLE IF NOT EXISTS job_runs (
id INTEGER PRIMARY KEY AUTOINCREMENT,
job_id INTEGER NOT NULL,
started_at TEXT NOT NULL DEFAULT (datetime('now')),
ended_at TEXT,
status TEXT NOT NULL DEFAULT 'running' CHECK(status IN ('running','success','error')),
result TEXT NOT NULL DEFAULT ''
);
CREATE INDEX IF NOT EXISTS idx_job_runs_job_id ON job_runs(job_id);
`);
}
@@ -73,8 +108,33 @@ function seed() {
db.exec('COMMIT');
}
function seedJobs() {
const upsert = db.prepare(`
INSERT OR IGNORE INTO jobs (key, name, description, enabled, schedule, config)
VALUES (?, ?, ?, ?, ?, ?)
`);
const apiKey = getConfig('tailscale_api_key');
const tailnet = getConfig('tailscale_tailnet');
const tsSchedule = parseInt(getConfig('tailscale_poll_minutes', '15'), 10) || 15;
const tsEnabled = getConfig('tailscale_enabled') === '1' ? 1 : 0;
upsert.run('tailscale_sync', 'Tailscale Sync',
'Syncs Tailscale device status and IPs to instances by matching hostnames.',
tsEnabled, tsSchedule, JSON.stringify({ api_key: apiKey, tailnet }));
upsert.run('patchmon_sync', 'Patchmon Sync',
'Syncs Patchmon host registration status to instances by matching hostnames.',
0, 60, JSON.stringify({ api_url: 'http://patchmon:3000/api/v1/api/hosts', api_token: '' }));
upsert.run('semaphore_sync', 'Semaphore Sync',
'Syncs Semaphore inventory membership to instances by matching hostnames.',
0, 60, JSON.stringify({ api_url: 'http://semaphore:3000/api/project/1/inventory/1', api_token: '' }));
}
// ── Queries ───────────────────────────────────────────────────────────────────
const VALID_SORTS = ['name', 'vmid', 'updated_at', 'created_at'];
export function getInstances(filters = {}) {
const parts = ['SELECT * FROM instances WHERE 1=1'];
const params = {};
@@ -84,7 +144,11 @@ export function getInstances(filters = {}) {
}
if (filters.state) { parts.push('AND state = @state'); params.state = filters.state; }
if (filters.stack) { parts.push('AND stack = @stack'); params.stack = filters.stack; }
parts.push('ORDER BY name ASC');
const sortField = VALID_SORTS.includes(filters.sort) ? filters.sort : 'name';
const sortOrder = filters.order === 'desc' ? 'DESC' : 'ASC';
// id is a stable tiebreaker for timestamp fields (datetime precision is 1 s)
const tiebreaker = (sortField === 'created_at' || sortField === 'updated_at') ? `, id ${sortOrder}` : '';
parts.push(`ORDER BY ${sortField} ${sortOrder}${tiebreaker}`);
return db.prepare(parts.join(' ')).all(params);
}
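The whitelist-and-fallback logic above can be exercised on its own. A minimal sketch, assuming the same VALID_SORTS list; `buildOrderBy` is a hypothetical helper, not part of the codebase:

```javascript
// Sketch of the sort whitelist in getInstances. Unknown fields fall back
// to name, and timestamp sorts get an id tiebreaker because datetime()
// only has 1 s precision.
const VALID_SORTS = ['name', 'vmid', 'updated_at', 'created_at'];

function buildOrderBy(sort, order) {
  const field = VALID_SORTS.includes(sort) ? sort : 'name';
  const dir = order === 'desc' ? 'DESC' : 'ASC';
  const tiebreaker =
    field === 'created_at' || field === 'updated_at' ? `, id ${dir}` : '';
  return `ORDER BY ${field} ${dir}${tiebreaker}`;
}

console.log(buildOrderBy('vmid', 'desc'));      // ORDER BY vmid DESC
console.log(buildOrderBy('created_at', 'asc')); // ORDER BY created_at ASC, id ASC
console.log(buildOrderBy('bad_field', 'desc')); // falls back: ORDER BY name DESC
```

Interpolating `sortField` into the SQL string is safe here only because it is first checked against the whitelist; raw user input never reaches the query text.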
@@ -99,8 +163,14 @@ export function getDistinctStacks() {
// ── Mutations ─────────────────────────────────────────────────────────────────
const HISTORY_FIELDS = [
'name', 'state', 'stack', 'vmid', 'tailscale_ip',
'atlas', 'argus', 'semaphore', 'patchmon', 'tailscale', 'andromeda',
'hardware_acceleration',
];
export function createInstance(data) {
return db.prepare(`
db.prepare(`
INSERT INTO instances
(name, state, stack, vmid, atlas, argus, semaphore, patchmon,
tailscale, andromeda, tailscale_ip, hardware_acceleration)
@@ -108,21 +178,158 @@ export function createInstance(data) {
(@name, @state, @stack, @vmid, @atlas, @argus, @semaphore, @patchmon,
@tailscale, @andromeda, @tailscale_ip, @hardware_acceleration)
`).run(data);
db.prepare(
`INSERT INTO instance_history (vmid, field, old_value, new_value, changed_at)
VALUES (?, 'created', NULL, NULL, strftime('%Y-%m-%dT%H:%M:%f', 'now'))`
).run(data.vmid);
}
export function updateInstance(vmid, data) {
return db.prepare(`
const old = getInstance(vmid);
db.prepare(`
UPDATE instances SET
name=@name, state=@state, stack=@stack, vmid=@newVmid,
atlas=@atlas, argus=@argus, semaphore=@semaphore, patchmon=@patchmon,
tailscale=@tailscale, andromeda=@andromeda, tailscale_ip=@tailscale_ip,
hardware_acceleration=@hardware_acceleration, updated_at=datetime('now')
hardware_acceleration=@hardware_acceleration, updated_at=strftime('%Y-%m-%dT%H:%M:%f', 'now')
WHERE vmid=@vmid
`).run({ ...data, newVmid: data.vmid, vmid });
const newVmid = data.vmid;
const insertEvt = db.prepare(
`INSERT INTO instance_history (vmid, field, old_value, new_value, changed_at)
VALUES (?, ?, ?, ?, strftime('%Y-%m-%dT%H:%M:%f', 'now'))`
);
for (const field of HISTORY_FIELDS) {
const oldVal = String(old[field] ?? '');
const newVal = String(field === 'vmid' ? newVmid : (data[field] ?? ''));
if (oldVal !== newVal) insertEvt.run(newVmid, field, oldVal, newVal);
}
}
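The change-detection loop above compares every tracked field as a string and emits one history event per difference. A minimal sketch using a subset of HISTORY_FIELDS; `diffFields` is a hypothetical helper, not part of the codebase:

```javascript
// Sketch of the per-field diff in updateInstance. Values are compared as
// strings, matching how the handler stringifies old/new before inserting
// instance_history rows; null/undefined normalise to ''.
const HISTORY_FIELDS = ['name', 'state', 'stack', 'vmid', 'tailscale_ip'];

function diffFields(oldRow, newRow) {
  const events = [];
  for (const field of HISTORY_FIELDS) {
    const oldVal = String(oldRow[field] ?? '');
    const newVal = String(newRow[field] ?? '');
    if (oldVal !== newVal) events.push({ field, oldVal, newVal });
  }
  return events;
}

const events = diffFields(
  { name: 'plex', state: 'testing', stack: 'development', vmid: 100, tailscale_ip: '' },
  { name: 'plex', state: 'deployed', stack: 'development', vmid: 100, tailscale_ip: '100.64.0.7' },
);
// Two events: state testing→deployed and tailscale_ip ''→'100.64.0.7'
```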
export function deleteInstance(vmid) {
return db.prepare('DELETE FROM instances WHERE vmid = ?').run(vmid);
db.prepare('DELETE FROM instance_history WHERE vmid = ?').run(vmid);
db.prepare('DELETE FROM instances WHERE vmid = ?').run(vmid);
}
export function importInstances(rows, historyRows = []) {
db.exec('BEGIN');
db.exec('DELETE FROM instance_history');
db.exec('DELETE FROM instances');
const insert = db.prepare(`
INSERT INTO instances
(name, state, stack, vmid, atlas, argus, semaphore, patchmon,
tailscale, andromeda, tailscale_ip, hardware_acceleration)
VALUES
(@name, @state, @stack, @vmid, @atlas, @argus, @semaphore, @patchmon,
@tailscale, @andromeda, @tailscale_ip, @hardware_acceleration)
`);
for (const row of rows) insert.run(row);
if (historyRows.length) {
const insertHist = db.prepare(
`INSERT INTO instance_history (vmid, field, old_value, new_value, changed_at) VALUES (?, ?, ?, ?, ?)`
);
for (const h of historyRows) insertHist.run(h.vmid, h.field, h.old_value ?? null, h.new_value ?? null, h.changed_at);
}
db.exec('COMMIT');
}
export function getInstanceHistory(vmid) {
return db.prepare(
'SELECT * FROM instance_history WHERE vmid = ? ORDER BY changed_at DESC, id DESC'
).all(vmid);
}
export function getAllHistory() {
return db.prepare('SELECT * FROM instance_history ORDER BY vmid, changed_at').all();
}
export function getAllJobs() {
return db.prepare('SELECT id, key, name, description, enabled, schedule, config FROM jobs ORDER BY id').all();
}
export function getAllJobRuns() {
return db.prepare('SELECT * FROM job_runs ORDER BY job_id, id').all();
}
export function importJobs(jobRows, jobRunRows = []) {
db.exec('BEGIN');
db.exec('DELETE FROM job_runs');
db.exec('DELETE FROM jobs');
const insertJob = db.prepare(`
INSERT INTO jobs (id, key, name, description, enabled, schedule, config)
VALUES (@id, @key, @name, @description, @enabled, @schedule, @config)
`);
for (const j of jobRows) insertJob.run(j);
if (jobRunRows.length) {
const insertRun = db.prepare(`
INSERT INTO job_runs (id, job_id, started_at, ended_at, status, result)
VALUES (@id, @job_id, @started_at, @ended_at, @status, @result)
`);
for (const r of jobRunRows) insertRun.run(r);
}
db.exec('COMMIT');
}
export function getConfig(key, defaultVal = '') {
const row = db.prepare('SELECT value FROM config WHERE key = ?').get(key);
return row ? row.value : defaultVal;
}
export function setConfig(key, value) {
db.prepare(
`INSERT INTO config (key, value) VALUES (?, ?)
ON CONFLICT(key) DO UPDATE SET value = excluded.value`
).run(key, String(value));
}
// ── Jobs ──────────────────────────────────────────────────────────────────────
const JOB_WITH_LAST_RUN = `
SELECT j.*,
r.id AS last_run_id,
r.started_at AS last_run_at,
r.status AS last_status,
r.result AS last_result
FROM jobs j
LEFT JOIN job_runs r
ON r.id = (SELECT id FROM job_runs WHERE job_id = j.id ORDER BY id DESC LIMIT 1)
`;
export function getJobs() {
return db.prepare(JOB_WITH_LAST_RUN + ' ORDER BY j.id').all();
}
export function getJob(id) {
return db.prepare(JOB_WITH_LAST_RUN + ' WHERE j.id = ?').get(id) ?? null;
}
export function createJob(data) {
db.prepare(`
INSERT INTO jobs (key, name, description, enabled, schedule, config)
VALUES (@key, @name, @description, @enabled, @schedule, @config)
`).run(data);
}
export function updateJob(id, { enabled, schedule, config }) {
db.prepare(`
UPDATE jobs SET enabled=@enabled, schedule=@schedule, config=@config WHERE id=@id
`).run({ id, enabled, schedule, config });
}
export function createJobRun(jobId) {
return Number(db.prepare(
`INSERT INTO job_runs (job_id, started_at) VALUES (?, strftime('%Y-%m-%dT%H:%M:%f', 'now'))`
).run(jobId).lastInsertRowid);
}
export function completeJobRun(runId, status, result) {
db.prepare(`
UPDATE job_runs SET ended_at=strftime('%Y-%m-%dT%H:%M:%f', 'now'), status=@status, result=@result WHERE id=@id
`).run({ id: runId, status, result });
}
export function getJobRuns(jobId) {
return db.prepare('SELECT * FROM job_runs WHERE job_id = ? ORDER BY id DESC').all(jobId);
}
// ── Test helpers ──────────────────────────────────────────────────────────────
@@ -133,5 +340,18 @@ export function _resetForTest() {
}
// ── Boot ──────────────────────────────────────────────────────────────────────
// Skipped in test environment — parallel Vitest workers would race to open
// the same file, causing "database is locked". _resetForTest() in beforeEach
// handles initialisation for every test worker using :memory: instead.
init(process.env.DB_PATH ?? DEFAULT_PATH);
if (process.env.NODE_ENV !== 'test') {
const DB_PATH = process.env.DB_PATH ?? DEFAULT_PATH;
try {
init(DB_PATH);
} catch (e) {
console.error('[catalyst] fatal: could not open database at', DB_PATH);
console.error('[catalyst] ensure the data directory exists and is writable by the server process.');
console.error(e);
process.exit(1);
}
}

server/jobs.js (new file, +150)

@@ -0,0 +1,150 @@
import { getJobs, getJob, getInstances, updateInstance, createJobRun, completeJobRun } from './db.js';
// ── Handlers ──────────────────────────────────────────────────────────────────
const TAILSCALE_API = 'https://api.tailscale.com/api/v2';
async function tailscaleSyncHandler(cfg) {
const { api_key, tailnet } = cfg;
if (!api_key || !tailnet) throw new Error('Tailscale not configured — set API key and tailnet');
const res = await fetch(
`${TAILSCALE_API}/tailnet/${encodeURIComponent(tailnet)}/devices`,
{ headers: { Authorization: `Bearer ${api_key}` } }
);
if (!res.ok) throw new Error(`Tailscale API ${res.status}`);
const { devices } = await res.json();
const tsMap = new Map(
devices.map(d => [d.hostname, (d.addresses ?? []).find(a => a.startsWith('100.')) ?? ''])
);
const instances = getInstances();
let updated = 0;
for (const inst of instances) {
const tsIp = tsMap.get(inst.name);
const matched = tsIp !== undefined;
const newTailscale = matched ? 1 : (inst.tailscale === 1 ? 0 : inst.tailscale);
const newIp = matched ? tsIp : (inst.tailscale === 1 ? '' : inst.tailscale_ip);
if (newTailscale !== inst.tailscale || newIp !== inst.tailscale_ip) {
const { id: _id, created_at: _ca, updated_at: _ua, ...instData } = inst;
updateInstance(inst.vmid, { ...instData, tailscale: newTailscale, tailscale_ip: newIp });
updated++;
}
}
return { summary: `${updated} updated of ${instances.length}` };
}
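The hostname-to-IP mapping above can be demonstrated with a fabricated devices payload (the real data comes from the Tailscale v2 API). Only the first address matching the handler's `100.` prefix heuristic is kept; devices without one map to the empty string:

```javascript
// Sketch of the tsMap construction in tailscaleSyncHandler, run against
// a made-up devices array rather than a live API response.
const devices = [
  { hostname: 'plex', addresses: ['100.64.0.7', 'fd7a::1'] },
  { hostname: 'traefik', addresses: ['fd7a::2'] },
];

const tsMap = new Map(
  devices.map(d => [d.hostname, (d.addresses ?? []).find(a => a.startsWith('100.')) ?? ''])
);

console.log(tsMap.get('plex'));    // 100.64.0.7
console.log(tsMap.get('traefik')); // '' (no 100.x address)
console.log(tsMap.get('unknown')); // undefined (hostname not in the tailnet)
```

The `undefined` vs `''` distinction matters downstream: `undefined` means the hostname is absent from the tailnet entirely, which is what drives the `matched` flag in the sync loop.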
// ── Patchmon Sync ─────────────────────────────────────────────────────────────
async function patchmonSyncHandler(cfg) {
const { api_url, api_token } = cfg;
if (!api_url || !api_token) throw new Error('Patchmon not configured — set API URL and token');
const res = await fetch(api_url, {
headers: { Authorization: `Basic ${api_token}` },
});
if (!res.ok) throw new Error(`Patchmon API ${res.status}`);
const data = await res.json();
const items = Array.isArray(data) ? data : (data.hosts ?? data.data ?? []);
const hostSet = new Set(
items.map(h => (typeof h === 'string' ? h : (h.name ?? h.hostname ?? h.host ?? '')))
.filter(Boolean)
);
const instances = getInstances();
let updated = 0;
for (const inst of instances) {
const newPatchmon = hostSet.has(inst.name) ? 1 : 0;
if (newPatchmon !== inst.patchmon) {
const { id: _id, created_at: _ca, updated_at: _ua, ...instData } = inst;
updateInstance(inst.vmid, { ...instData, patchmon: newPatchmon });
updated++;
}
}
return { summary: `${updated} updated of ${instances.length}` };
}
// ── Semaphore Sync ────────────────────────────────────────────────────────────
async function semaphoreSyncHandler(cfg) {
const { api_url, api_token } = cfg;
if (!api_url || !api_token) throw new Error('Semaphore not configured — set API URL and token');
const res = await fetch(api_url, {
headers: { Authorization: `Bearer ${api_token}` },
});
if (!res.ok) throw new Error(`Semaphore API ${res.status}`);
const data = await res.json();
// Inventory is an Ansible INI string; extract bare hostnames
const hostSet = new Set(
(data.inventory ?? '').split('\n')
.map(l => l.trim())
.filter(l => l && !l.startsWith('[') && !l.startsWith('#') && !l.startsWith(';'))
.map(l => l.split(/[\s=]/)[0])
.filter(Boolean)
);
const instances = getInstances();
let updated = 0;
for (const inst of instances) {
const newSemaphore = hostSet.has(inst.name) ? 1 : 0;
if (newSemaphore !== inst.semaphore) {
const { id: _id, created_at: _ca, updated_at: _ua, ...instData } = inst;
updateInstance(inst.vmid, { ...instData, semaphore: newSemaphore });
updated++;
}
}
return { summary: `${updated} updated of ${instances.length}` };
}
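The INI hostname extraction above can be run against a fabricated Semaphore inventory string. Section headers, `#`/`;` comments, and per-host variables are stripped; only bare hostnames survive:

```javascript
// Sketch of the hostSet construction in semaphoreSyncHandler, using a
// made-up Ansible INI inventory instead of a live API response.
const inventory = [
  '[webservers]',
  'plex ansible_host=10.0.0.5',
  'traefik',
  '; a comment',
  '# another comment',
  '',
].join('\n');

const hostSet = new Set(
  inventory.split('\n')
    .map(l => l.trim())
    .filter(l => l && !l.startsWith('[') && !l.startsWith('#') && !l.startsWith(';'))
    .map(l => l.split(/[\s=]/)[0])
    .filter(Boolean)
);

console.log([...hostSet]); // [ 'plex', 'traefik' ]
```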
// ── Registry ──────────────────────────────────────────────────────────────────
const HANDLERS = {
tailscale_sync: tailscaleSyncHandler,
patchmon_sync: patchmonSyncHandler,
semaphore_sync: semaphoreSyncHandler,
};
// ── Public API ────────────────────────────────────────────────────────────────
export async function runJob(jobId) {
const job = getJob(jobId);
if (!job) throw new Error('Job not found');
const handler = HANDLERS[job.key];
if (!handler) throw new Error(`No handler for '${job.key}'`);
const cfg = JSON.parse(job.config || '{}');
const runId = createJobRun(jobId);
try {
const result = await handler(cfg);
completeJobRun(runId, 'success', result.summary ?? '');
return result;
} catch (e) {
completeJobRun(runId, 'error', e.message);
throw e;
}
}
const _intervals = new Map();
export async function runJobsOnCreate() {
for (const job of getJobs()) {
const cfg = JSON.parse(job.config || '{}');
if (cfg.run_on_create) {
try { await runJob(job.id); } catch (e) { console.error(`runJobsOnCreate job ${job.id}:`, e); }
}
}
}
export function restartJobs() {
for (const iv of _intervals.values()) clearInterval(iv);
_intervals.clear();
for (const job of getJobs()) {
if (!job.enabled) continue;
const ms = Math.max(1, job.schedule || 15) * 60_000;
const id = job.id;
_intervals.set(id, setInterval(() => runJob(id).catch(() => {}), ms));
}
}
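The interval arithmetic in restartJobs combines two guards: `|| 15` supplies the default when schedule is 0 or missing, and `Math.max(1, …)` floors the result at one minute. A minimal sketch; `intervalMs` is a hypothetical helper, not part of the codebase:

```javascript
// Sketch of the schedule-to-milliseconds conversion in restartJobs:
// schedule is in minutes, 15 is the fallback, 1 minute is the floor.
const intervalMs = schedule => Math.max(1, schedule || 15) * 60_000;

console.log(intervalMs(5));         // 300000
console.log(intervalMs(0));         // 900000 (falls back to 15 min)
console.log(intervalMs(undefined)); // 900000
```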


@@ -1,8 +1,11 @@
import { Router } from 'express';
import {
getInstances, getInstance, getDistinctStacks,
createInstance, updateInstance, deleteInstance,
createInstance, updateInstance, deleteInstance, importInstances, getInstanceHistory, getAllHistory,
getConfig, setConfig, getJobs, getJob, updateJob, getJobRuns,
getAllJobs, getAllJobRuns, importJobs,
} from './db.js';
import { runJob, restartJobs, runJobsOnCreate } from './jobs.js';
export const router = Router();
@@ -12,6 +15,15 @@ const VALID_STATES = ['deployed', 'testing', 'degraded'];
const VALID_STACKS = ['production', 'development'];
const SERVICE_KEYS = ['atlas', 'argus', 'semaphore', 'patchmon', 'tailscale', 'andromeda'];
const REDACTED = '**REDACTED**';
function maskJob(job) {
const cfg = JSON.parse(job.config || '{}');
if (cfg.api_key) cfg.api_key = REDACTED;
if (cfg.api_token) cfg.api_token = REDACTED;
return { ...job, config: cfg };
}
function validate(body) {
const errors = [];
if (!body.name || typeof body.name !== 'string' || !body.name.trim())
@@ -22,12 +34,22 @@ function validate(body) {
errors.push(`state must be one of: ${VALID_STATES.join(', ')}`);
if (!VALID_STACKS.includes(body.stack))
errors.push(`stack must be one of: ${VALID_STACKS.join(', ')}`);
const ip = (body.tailscale_ip ?? '').trim();
if (ip && !/^(\d{1,3}\.){3}\d{1,3}$/.test(ip))
errors.push('tailscale_ip must be a valid IPv4 address or empty');
return errors;
}
function handleDbError(context, e, res) {
if (e.message.includes('UNIQUE')) return res.status(409).json({ error: 'vmid already exists' });
if (e.message.includes('CHECK')) return res.status(400).json({ error: 'invalid field value' });
console.error(context, e);
res.status(500).json({ error: 'internal server error' });
}
function normalise(body) {
const row = {
name: body.name.trim(),
name: (body.name ?? '').trim(),
state: body.state,
stack: body.stack,
vmid: body.vmid,
@@ -47,8 +69,16 @@ router.get('/instances/stacks', (_req, res) => {
// GET /api/instances
router.get('/instances', (req, res) => {
const { search, state, stack } = req.query;
res.json(getInstances({ search, state, stack }));
const { search, state, stack, sort, order } = req.query;
res.json(getInstances({ search, state, stack, sort, order }));
});
// GET /api/instances/:vmid/history
router.get('/instances/:vmid/history', (req, res) => {
const vmid = parseInt(req.params.vmid, 10);
if (!vmid) return res.status(400).json({ error: 'invalid vmid' });
if (!getInstance(vmid)) return res.status(404).json({ error: 'instance not found' });
res.json(getInstanceHistory(vmid));
});
// GET /api/instances/:vmid
@@ -72,10 +102,9 @@ router.post('/instances', (req, res) => {
createInstance(data);
const created = getInstance(data.vmid);
res.status(201).json(created);
runJobsOnCreate().catch(() => {});
} catch (e) {
if (e.message.includes('UNIQUE')) return res.status(409).json({ error: 'vmid already exists' });
if (e.message.includes('CHECK')) return res.status(400).json({ error: 'invalid field value' });
throw e;
handleDbError('POST /api/instances', e, res);
}
});
@@ -93,9 +122,46 @@ router.put('/instances/:vmid', (req, res) => {
updateInstance(vmid, data);
res.json(getInstance(data.vmid));
} catch (e) {
if (e.message.includes('UNIQUE')) return res.status(409).json({ error: 'vmid already exists' });
if (e.message.includes('CHECK')) return res.status(400).json({ error: 'invalid field value' });
throw e;
handleDbError('PUT /api/instances/:vmid', e, res);
}
});
// GET /api/export
router.get('/export', (_req, res) => {
const instances = getInstances();
const history = getAllHistory();
const jobs = getAllJobs();
const job_runs = getAllJobRuns();
const date = new Date().toISOString().slice(0, 10);
res.setHeader('Content-Disposition', `attachment; filename="catalyst-backup-${date}.json"`);
res.json({ version: 3, exported_at: new Date().toISOString(), instances, history, jobs, job_runs });
});
// POST /api/import
router.post('/import', (req, res) => {
const { instances, history = [], jobs, job_runs } = req.body ?? {};
if (!Array.isArray(instances)) {
return res.status(400).json({ error: 'body must contain an instances array' });
}
const errors = [];
for (const [i, row] of instances.entries()) {
const errs = validate(normalise(row));
if (errs.length) errors.push({ index: i, errors: errs });
}
if (errors.length) return res.status(400).json({ errors });
try {
importInstances(instances.map(normalise), Array.isArray(history) ? history : []);
if (Array.isArray(jobs)) {
importJobs(jobs, Array.isArray(job_runs) ? job_runs : []);
try { restartJobs(); } catch (e) { console.error('POST /api/import restartJobs', e); }
}
res.json({
imported: instances.length,
imported_jobs: Array.isArray(jobs) ? jobs.length : undefined,
});
} catch (e) {
console.error('POST /api/import', e);
res.status(500).json({ error: 'internal server error' });
}
});
@@ -109,6 +175,56 @@ router.delete('/instances/:vmid', (req, res) => {
if (instance.stack !== 'development')
return res.status(422).json({ error: 'only development instances can be deleted' });
try {
deleteInstance(vmid);
res.status(204).end();
} catch (e) {
handleDbError('DELETE /api/instances/:vmid', e, res);
}
});
// GET /api/jobs
router.get('/jobs', (_req, res) => {
res.json(getJobs().map(maskJob));
});
// GET /api/jobs/:id
router.get('/jobs/:id', (req, res) => {
const id = parseInt(req.params.id, 10);
if (!id) return res.status(400).json({ error: 'invalid id' });
const job = getJob(id);
if (!job) return res.status(404).json({ error: 'job not found' });
res.json({ ...maskJob(job), runs: getJobRuns(id) });
});
// PUT /api/jobs/:id
router.put('/jobs/:id', (req, res) => {
const id = parseInt(req.params.id, 10);
if (!id) return res.status(400).json({ error: 'invalid id' });
const job = getJob(id);
if (!job) return res.status(404).json({ error: 'job not found' });
const { enabled, schedule, config: newCfg } = req.body ?? {};
const existingCfg = JSON.parse(job.config || '{}');
const mergedCfg = { ...existingCfg, ...(newCfg ?? {}) };
if (newCfg?.api_key === REDACTED) mergedCfg.api_key = existingCfg.api_key;
if (newCfg?.api_token === REDACTED) mergedCfg.api_token = existingCfg.api_token;
updateJob(id, {
enabled: enabled != null ? (enabled ? 1 : 0) : job.enabled,
schedule: schedule != null ? (parseInt(schedule, 10) || 15) : job.schedule,
config: JSON.stringify(mergedCfg),
});
try { restartJobs(); } catch (e) { console.error('PUT /api/jobs/:id restartJobs', e); }
res.json(maskJob(getJob(id)));
});
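The secret round-trip above deserves a standalone illustration: GET responses replace secrets with **REDACTED**, so when a client saves the form without editing them, the placeholder comes back and the stored secret must be preserved rather than overwritten. A minimal sketch; `mergeConfig` is a hypothetical helper mirroring the handler's merge logic:

```javascript
// Sketch of the redaction-aware config merge in PUT /api/jobs/:id.
// Echoed-back placeholders are swapped for the stored secret; every
// other key is taken from the incoming config.
const REDACTED = '**REDACTED**';

function mergeConfig(existingCfg, newCfg) {
  const merged = { ...existingCfg, ...(newCfg ?? {}) };
  if (newCfg?.api_key === REDACTED) merged.api_key = existingCfg.api_key;
  if (newCfg?.api_token === REDACTED) merged.api_token = existingCfg.api_token;
  return merged;
}

const stored = { tailnet: 'example.ts.net', api_key: 'tskey-real-secret' };
// Client edited tailnet but echoed the masked api_key back unchanged:
const merged = mergeConfig(stored, { tailnet: 'new.ts.net', api_key: REDACTED });
console.log(merged); // { tailnet: 'new.ts.net', api_key: 'tskey-real-secret' }
```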
// POST /api/jobs/:id/run
router.post('/jobs/:id/run', async (req, res) => {
const id = parseInt(req.params.id, 10);
if (!id) return res.status(400).json({ error: 'invalid id' });
if (!getJob(id)) return res.status(404).json({ error: 'job not found' });
try {
res.json(await runJob(id));
} catch (e) {
handleDbError('POST /api/jobs/:id/run', e, res);
}
});


@@ -1,13 +1,32 @@
import express from 'express';
import helmet from 'helmet';
import { fileURLToPath } from 'url';
import { dirname, join } from 'path';
import { router } from './routes.js';
import { restartJobs } from './jobs.js';
const __dirname = dirname(fileURLToPath(import.meta.url));
const PORT = process.env.PORT ?? 3000;
export const app = express();
app.use(helmet({
contentSecurityPolicy: {
useDefaults: false, // explicit — upgrade-insecure-requests breaks HTTP deployments
directives: {
'default-src': ["'self'"],
'base-uri': ["'self'"],
'font-src': ["'self'", 'https://fonts.gstatic.com'],
'form-action': ["'self'"],
'frame-ancestors': ["'self'"],
'img-src': ["'self'", 'data:'],
'object-src': ["'none'"],
'script-src': ["'self'"],
'script-src-attr': ["'unsafe-inline'"], // allow onclick handlers
'style-src': ["'self'", 'https://fonts.googleapis.com'],
},
},
}));
app.use(express.json());
// API
@@ -29,5 +48,6 @@ app.use((err, _req, res, _next) => {
// Boot — only when run directly, not when imported by tests
if (process.argv[1] === fileURLToPath(import.meta.url)) {
restartJobs();
app.listen(PORT, () => console.log(`catalyst on :${PORT}`));
}


@@ -1,7 +1,8 @@
import { describe, it, expect, beforeEach } from 'vitest'
import { describe, it, expect, beforeEach, afterEach, vi } from 'vitest'
import request from 'supertest'
import { app } from '../server/server.js'
import { _resetForTest } from '../server/db.js'
import { _resetForTest, createJob } from '../server/db.js'
import * as dbModule from '../server/db.js'
beforeEach(() => _resetForTest())
@@ -73,6 +74,54 @@ describe('GET /api/instances', () => {
expect(res.body).toHaveLength(1)
expect(res.body[0].name).toBe('plex')
})
it('sorts by vmid ascending', async () => {
await request(app).post('/api/instances').send({ ...base, vmid: 200, name: 'b' })
await request(app).post('/api/instances').send({ ...base, vmid: 100, name: 'a' })
const res = await request(app).get('/api/instances?sort=vmid&order=asc')
expect(res.body[0].vmid).toBe(100)
expect(res.body[1].vmid).toBe(200)
})
it('sorts by vmid descending', async () => {
await request(app).post('/api/instances').send({ ...base, vmid: 100, name: 'a' })
await request(app).post('/api/instances').send({ ...base, vmid: 200, name: 'b' })
const res = await request(app).get('/api/instances?sort=vmid&order=desc')
expect(res.body[0].vmid).toBe(200)
expect(res.body[1].vmid).toBe(100)
})
it('sorts by name descending', async () => {
await request(app).post('/api/instances').send({ ...base, vmid: 1, name: 'alpha' })
await request(app).post('/api/instances').send({ ...base, vmid: 2, name: 'zebra' })
const res = await request(app).get('/api/instances?sort=name&order=desc')
expect(res.body[0].name).toBe('zebra')
expect(res.body[1].name).toBe('alpha')
})
it('sorts by created_at desc — id tiebreaker preserves insertion order', async () => {
await request(app).post('/api/instances').send({ ...base, vmid: 1, name: 'first' })
await request(app).post('/api/instances').send({ ...base, vmid: 2, name: 'second' })
const res = await request(app).get('/api/instances?sort=created_at&order=desc')
expect(res.body[0].name).toBe('second') // id=2 before id=1
expect(res.body[1].name).toBe('first')
})
it('sorts by updated_at desc — id tiebreaker preserves insertion order', async () => {
await request(app).post('/api/instances').send({ ...base, vmid: 1, name: 'a' })
await request(app).post('/api/instances').send({ ...base, vmid: 2, name: 'b' })
const res = await request(app).get('/api/instances?sort=updated_at&order=desc')
expect(res.body[0].name).toBe('b') // id=2 before id=1
expect(res.body[1].name).toBe('a')
})
it('ignores invalid sort field and falls back to name asc', async () => {
await request(app).post('/api/instances').send({ ...base, vmid: 1, name: 'zebra' })
await request(app).post('/api/instances').send({ ...base, vmid: 2, name: 'alpha' })
const res = await request(app).get('/api/instances?sort=bad_field')
expect(res.status).toBe(200)
expect(res.body[0].name).toBe('alpha')
})
})
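The behaviour these tests pin down can be sketched as a whitelist-based ORDER BY builder. This is an illustrative sketch, not the actual route code — names and structure are assumptions; the real logic lives in `server/db.js`:

```javascript
// Illustrative sketch (not the real server code): map untrusted ?sort=/?order=
// values onto a safe ORDER BY clause. Unknown fields fall back to name asc,
// and id is appended as a tiebreaker because datetime() has 1 s precision.
const SORTABLE = ['name', 'vmid', 'created_at', 'updated_at']

function buildOrderBy(sort, order) {
  const field = SORTABLE.includes(sort) ? sort : 'name'  // whitelist: no injection path
  const dir = order === 'desc' ? 'DESC' : 'ASC'
  return `ORDER BY ${field} ${dir}, id ${dir}`
}

console.log(buildOrderBy('created_at', 'desc')) // ORDER BY created_at DESC, id DESC
console.log(buildOrderBy('bad_field'))          // ORDER BY name ASC, id ASC
```

Interpolating `field` is only safe because it comes from the whitelist, never from the query string directly.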
// ── GET /api/instances/stacks ─────────────────────────────────────────────────
@@ -237,3 +286,424 @@ describe('DELETE /api/instances/:vmid', () => {
expect(res.status).toBe(400)
})
})
// ── GET /api/instances/:vmid/history ─────────────────────────────────────────
describe('GET /api/instances/:vmid/history', () => {
it('returns history events for a known vmid', async () => {
await request(app).post('/api/instances').send(base)
const res = await request(app).get('/api/instances/100/history')
expect(res.status).toBe(200)
expect(res.body).toBeInstanceOf(Array)
expect(res.body[0].field).toBe('created')
})
it('returns 404 for unknown vmid', async () => {
expect((await request(app).get('/api/instances/999/history')).status).toBe(404)
})
it('returns 400 for non-numeric vmid', async () => {
expect((await request(app).get('/api/instances/abc/history')).status).toBe(400)
})
})
// ── GET /api/export ───────────────────────────────────────────────────────────
describe('GET /api/export', () => {
it('returns 200 with instances array and attachment header', async () => {
await request(app).post('/api/instances').send(base)
const res = await request(app).get('/api/export')
expect(res.status).toBe(200)
expect(res.headers['content-disposition']).toMatch(/attachment/)
expect(res.body.instances).toHaveLength(1)
expect(res.body.instances[0].name).toBe('traefik')
})
it('returns empty instances array when no data', async () => {
const res = await request(app).get('/api/export')
expect(res.body.instances).toEqual([])
})
it('returns version 3', async () => {
const res = await request(app).get('/api/export')
expect(res.body.version).toBe(3)
})
it('includes a history array', async () => {
await request(app).post('/api/instances').send(base)
const res = await request(app).get('/api/export')
expect(res.body.history).toBeInstanceOf(Array)
expect(res.body.history.some(e => e.field === 'created')).toBe(true)
})
it('includes jobs and job_runs arrays', async () => {
createJob(testJob)
const res = await request(app).get('/api/export')
expect(res.body.jobs).toBeInstanceOf(Array)
expect(res.body.jobs).toHaveLength(1)
expect(res.body.jobs[0].key).toBe('tailscale_sync')
expect(res.body.job_runs).toBeInstanceOf(Array)
})
it('exports raw job config without masking', async () => {
createJob(testJob)
const res = await request(app).get('/api/export')
expect(res.body.jobs[0].config).toContain('tskey-test')
})
})
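Taken together, the export tests describe a payload shape that can be sketched as below. The db accessor names (`getAllHistory`, `getJobsRaw`, `getAllJobRuns`) are illustrative stand-ins, not the real `server/db.js` exports:

```javascript
// Hypothetical sketch of the export payload the tests above describe.
function buildExport(db) {
  return {
    version: 3,                    // current backup schema version
    instances: db.getInstances(),
    history: db.getAllHistory(),   // assumed helper name
    jobs: db.getJobsRaw(),         // raw config — masking applies only to the UI-facing API
    job_runs: db.getAllJobRuns(),  // assumed helper name
  }
}

// Demo with a stubbed db:
const stubDb = {
  getInstances: () => [{ vmid: 100, name: 'traefik' }],
  getAllHistory: () => [],
  getJobsRaw: () => [],
  getAllJobRuns: () => [],
}
console.log(buildExport(stubDb).version) // 3
```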
// ── POST /api/import ──────────────────────────────────────────────────────────
describe('POST /api/import', () => {
it('replaces all instances and returns imported count', async () => {
await request(app).post('/api/instances').send(base)
const res = await request(app).post('/api/import')
.send({ instances: [{ ...base, vmid: 999, name: 'imported' }] })
expect(res.status).toBe(200)
expect(res.body.imported).toBe(1)
expect((await request(app).get('/api/instances')).body[0].name).toBe('imported')
})
it('returns 400 if instances is not an array', async () => {
expect((await request(app).post('/api/import').send({ instances: 'bad' })).status).toBe(400)
})
it('returns 400 with per-row errors for invalid rows', async () => {
const res = await request(app).post('/api/import')
.send({ instances: [{ ...base, name: '', vmid: 1 }] })
expect(res.status).toBe(400)
expect(res.body.errors[0].index).toBe(0)
})
it('returns 400 if body has no instances key', async () => {
expect((await request(app).post('/api/import').send({})).status).toBe(400)
})
it('returns 400 (not 500) when a row is missing name', async () => {
const res = await request(app).post('/api/import')
.send({ instances: [{ ...base, name: undefined, vmid: 1 }] })
expect(res.status).toBe(400)
})
it('restores history when history array is provided', async () => {
await request(app).post('/api/instances').send(base)
const exp = await request(app).get('/api/export')
await request(app).post('/api/instances').send({ ...base, vmid: 999, name: 'other' })
const res = await request(app).post('/api/import').send({
instances: exp.body.instances,
history: exp.body.history,
})
expect(res.status).toBe(200)
const hist = await request(app).get('/api/instances/100/history')
expect(hist.body.some(e => e.field === 'created')).toBe(true)
})
it('succeeds with a v1 backup that has no history key', async () => {
const res = await request(app).post('/api/import')
.send({ instances: [{ ...base, vmid: 1, name: 'legacy' }] })
expect(res.status).toBe(200)
expect(res.body.imported).toBe(1)
})
it('imports jobs and job_runs and returns imported_jobs count', async () => {
const exp = await request(app).get('/api/export')
createJob(testJob)
const fullExport = await request(app).get('/api/export')
const res = await request(app).post('/api/import').send({
instances: fullExport.body.instances,
history: fullExport.body.history,
jobs: fullExport.body.jobs,
job_runs: fullExport.body.job_runs,
})
expect(res.status).toBe(200)
expect(res.body.imported_jobs).toBe(1)
expect((await request(app).get('/api/jobs')).body).toHaveLength(1)
})
it('leaves jobs untouched when no jobs key in payload', async () => {
createJob(testJob)
await request(app).post('/api/import')
.send({ instances: [{ ...base, vmid: 1, name: 'x' }] })
expect((await request(app).get('/api/jobs')).body).toHaveLength(1)
})
})
// ── Static assets & SPA routing ───────────────────────────────────────────────
describe('static assets and SPA routing', () => {
it('serves index.html at root', async () => {
const res = await request(app).get('/')
expect(res.status).toBe(200)
expect(res.headers['content-type']).toMatch(/html/)
})
it('serves index.html for deep SPA routes (e.g. /instance/117)', async () => {
const res = await request(app).get('/instance/117')
expect(res.status).toBe(200)
expect(res.headers['content-type']).toMatch(/html/)
})
it('serves CSS with correct content-type (not sniffed as HTML)', async () => {
const res = await request(app).get('/css/app.css')
expect(res.status).toBe(200)
expect(res.headers['content-type']).toMatch(/text\/css/)
})
it('does not set upgrade-insecure-requests in CSP (HTTP deployments must work)', async () => {
const res = await request(app).get('/')
const csp = res.headers['content-security-policy'] ?? ''
expect(csp).not.toContain('upgrade-insecure-requests')
})
it('allows inline event handlers in CSP (onclick attributes)', async () => {
const res = await request(app).get('/')
const csp = res.headers['content-security-policy'] ?? ''
// script-src-attr must not be 'none' — that blocks onclick handlers
expect(csp).not.toContain("script-src-attr 'none'")
})
it('index.html contains base href / for correct asset resolution on deep routes', async () => {
const res = await request(app).get('/')
expect(res.text).toContain('<base href="/">')
})
})
// ── Error handling — unexpected DB failures ───────────────────────────────────
const dbError = () => Object.assign(
new Error('attempt to write a readonly database'),
{ code: 'ERR_SQLITE_ERROR', errcode: 8 }
)
describe('error handling — unexpected DB failures', () => {
let consoleSpy
beforeEach(() => {
consoleSpy = vi.spyOn(console, 'error').mockImplementation(() => {})
})
afterEach(() => {
vi.restoreAllMocks()
})
it('POST returns 500 with friendly message when DB throws unexpectedly', async () => {
vi.spyOn(dbModule, 'createInstance').mockImplementationOnce(() => { throw dbError() })
const res = await request(app).post('/api/instances').send(base)
expect(res.status).toBe(500)
expect(res.body).toEqual({ error: 'internal server error' })
})
it('POST logs the error with route context when DB throws unexpectedly', async () => {
vi.spyOn(dbModule, 'createInstance').mockImplementationOnce(() => { throw dbError() })
await request(app).post('/api/instances').send(base)
expect(consoleSpy).toHaveBeenCalledWith(
expect.stringContaining('POST /api/instances'),
expect.any(Error)
)
})
it('PUT returns 500 with friendly message when DB throws unexpectedly', async () => {
await request(app).post('/api/instances').send(base)
vi.spyOn(dbModule, 'updateInstance').mockImplementationOnce(() => { throw dbError() })
const res = await request(app).put('/api/instances/100').send(base)
expect(res.status).toBe(500)
expect(res.body).toEqual({ error: 'internal server error' })
})
it('PUT logs the error with route context when DB throws unexpectedly', async () => {
await request(app).post('/api/instances').send(base)
vi.spyOn(dbModule, 'updateInstance').mockImplementationOnce(() => { throw dbError() })
await request(app).put('/api/instances/100').send(base)
expect(consoleSpy).toHaveBeenCalledWith(
expect.stringContaining('PUT /api/instances/:vmid'),
expect.any(Error)
)
})
it('DELETE returns 500 with friendly message when DB throws unexpectedly', async () => {
await request(app).post('/api/instances').send({ ...base, stack: 'development', state: 'testing' })
vi.spyOn(dbModule, 'deleteInstance').mockImplementationOnce(() => { throw dbError() })
const res = await request(app).delete('/api/instances/100')
expect(res.status).toBe(500)
expect(res.body).toEqual({ error: 'internal server error' })
})
it('DELETE logs the error with route context when DB throws unexpectedly', async () => {
await request(app).post('/api/instances').send({ ...base, stack: 'development', state: 'testing' })
vi.spyOn(dbModule, 'deleteInstance').mockImplementationOnce(() => { throw dbError() })
await request(app).delete('/api/instances/100')
expect(consoleSpy).toHaveBeenCalledWith(
expect.stringContaining('DELETE /api/instances/:vmid'),
expect.any(Error)
)
})
})
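The error boundary these tests exercise can be sketched as a per-route wrapper: an unexpected throw is logged with route context, and the client only ever sees a generic message. This is an assumed shape for illustration, not the server's actual middleware:

```javascript
// Hypothetical sketch: wrap a handler so DB throws become logged 500s.
function safe(route, handler) {
  return async (req, res) => {
    try {
      await handler(req, res)
    } catch (err) {
      console.error(`${route} failed:`, err)                 // route context for the logs
      res.status(500).json({ error: 'internal server error' }) // never leak err.message
    }
  }
}
// assumed usage: app.post('/api/instances', safe('POST /api/instances', createHandler))

// Demo with a stubbed res object:
const res = { status(c) { this.code = c; return this }, json(b) { this.body = b } }
safe('POST /api/instances', () => { throw new Error('boom') })({}, res)
console.log(res.code, res.body.error) // 500 internal server error
```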
const testJob = {
key: 'tailscale_sync', name: 'Tailscale Sync', description: 'Test job',
enabled: 0, schedule: 15,
config: JSON.stringify({ api_key: 'tskey-test', tailnet: 'example.com' }),
}
const patchmonJob = {
key: 'patchmon_sync', name: 'Patchmon Sync', description: 'Test patchmon job',
enabled: 0, schedule: 60,
config: JSON.stringify({ api_url: 'http://patchmon:3000/api/v1/api/hosts', api_token: 'secret-token' }),
}
// ── GET /api/jobs ─────────────────────────────────────────────────────────────
describe('GET /api/jobs', () => {
it('returns empty array when no jobs', async () => {
const res = await request(app).get('/api/jobs')
expect(res.status).toBe(200)
expect(res.body).toEqual([])
})
it('returns jobs with masked api key', async () => {
createJob(testJob)
const res = await request(app).get('/api/jobs')
expect(res.body).toHaveLength(1)
expect(res.body[0].config.api_key).toBe('**REDACTED**')
})
it('returns jobs with masked api_token', async () => {
createJob(patchmonJob)
const res = await request(app).get('/api/jobs')
expect(res.body[0].config.api_token).toBe('**REDACTED**')
})
})
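The masking step these tests expect can be sketched as a redaction pass over the stored JSON config before it leaves the API. The `SECRET_KEYS` list is an assumption for illustration:

```javascript
// Hypothetical sketch: redact secret-bearing keys in a job's config.
const SECRET_KEYS = ['api_key', 'api_token']  // assumed list of sensitive fields

function maskConfig(configJson) {
  const cfg = JSON.parse(configJson)
  for (const key of SECRET_KEYS) {
    if (cfg[key]) cfg[key] = '**REDACTED**'   // mask only when a value is present
  }
  return cfg
}

console.log(maskConfig('{"api_key":"tskey-test","tailnet":"example.com"}'))
// { api_key: '**REDACTED**', tailnet: 'example.com' }
```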
// ── GET /api/jobs/:id ─────────────────────────────────────────────────────────
describe('GET /api/jobs/:id', () => {
it('returns job with runs array', async () => {
createJob(testJob)
const id = (await request(app).get('/api/jobs')).body[0].id
const res = await request(app).get(`/api/jobs/${id}`)
expect(res.status).toBe(200)
expect(res.body.runs).toBeInstanceOf(Array)
})
it('returns 404 for unknown id', async () => {
expect((await request(app).get('/api/jobs/999')).status).toBe(404)
})
it('returns 400 for non-numeric id', async () => {
expect((await request(app).get('/api/jobs/abc')).status).toBe(400)
})
})
// ── PUT /api/jobs/:id ─────────────────────────────────────────────────────────
describe('PUT /api/jobs/:id', () => {
it('updates enabled and schedule', async () => {
createJob(testJob)
const id = (await request(app).get('/api/jobs')).body[0].id
const res = await request(app).put(`/api/jobs/${id}`).send({ enabled: true, schedule: 30 })
expect(res.status).toBe(200)
expect(res.body.enabled).toBe(1)
expect(res.body.schedule).toBe(30)
})
it('does not overwrite api_key when **REDACTED** is sent', async () => {
createJob(testJob)
const id = (await request(app).get('/api/jobs')).body[0].id
await request(app).put(`/api/jobs/${id}`).send({ config: { api_key: '**REDACTED**' } })
expect(dbModule.getJob(id).config).toContain('tskey-test')
})
it('returns 404 for unknown id', async () => {
expect((await request(app).put('/api/jobs/999').send({})).status).toBe(404)
})
})
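The `**REDACTED**` round-trip test implies a merge on write: when a client echoes the masked placeholder back, the stored secret is kept instead of being overwritten. A minimal sketch, with illustrative names:

```javascript
// Hypothetical sketch of the config write path behind PUT /api/jobs/:id.
function mergeConfig(storedJson, incoming) {
  const stored = JSON.parse(storedJson)
  const merged = { ...stored, ...incoming }
  for (const [key, val] of Object.entries(incoming)) {
    if (val === '**REDACTED**') merged[key] = stored[key]  // keep the real secret
  }
  return merged
}

const merged = mergeConfig('{"api_key":"tskey-test"}',
  { api_key: '**REDACTED**', tailnet: 'new.example.com' })
console.log(merged.api_key) // tskey-test
```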
// ── POST /api/jobs/:id/run ────────────────────────────────────────────────────
describe('POST /api/jobs/:id/run', () => {
afterEach(() => vi.unstubAllGlobals())
it('returns 404 for unknown id', async () => {
expect((await request(app).post('/api/jobs/999/run')).status).toBe(404)
})
it('runs job, returns summary, and logs the run', async () => {
createJob(testJob)
const id = (await request(app).get('/api/jobs')).body[0].id
vi.stubGlobal('fetch', vi.fn().mockResolvedValueOnce({
ok: true,
json: async () => ({ devices: [] }),
}))
const res = await request(app).post(`/api/jobs/${id}/run`)
expect(res.status).toBe(200)
expect(res.body.summary).toBeDefined()
const detail = await request(app).get(`/api/jobs/${id}`)
expect(detail.body.runs).toHaveLength(1)
expect(detail.body.runs[0].status).toBe('success')
})
it('logs error run on failure', async () => {
createJob(testJob)
const id = (await request(app).get('/api/jobs')).body[0].id
vi.stubGlobal('fetch', vi.fn().mockRejectedValueOnce(new Error('network error')))
const res = await request(app).post(`/api/jobs/${id}/run`)
expect(res.status).toBe(500)
const detail = await request(app).get(`/api/jobs/${id}`)
expect(detail.body.runs[0].status).toBe('error')
})
it('patchmon_sync: marks instances present in host list as patchmon=1', async () => {
createJob(patchmonJob)
const id = (await request(app).get('/api/jobs')).body[0].id
vi.stubGlobal('fetch', vi.fn().mockResolvedValueOnce({
ok: true,
json: async () => [{ name: 'plex' }, { name: 'traefik' }],
}))
const res = await request(app).post(`/api/jobs/${id}/run`)
expect(res.status).toBe(200)
expect(res.body.summary).toMatch(/updated of/)
})
it('patchmon_sync: returns 500 when API token is missing', async () => {
createJob({ ...patchmonJob, config: JSON.stringify({ api_url: 'http://patchmon:3000/api/v1/api/hosts', api_token: '' }) })
const id = (await request(app).get('/api/jobs')).body[0].id
const res = await request(app).post(`/api/jobs/${id}/run`)
expect(res.status).toBe(500)
})
it('run_on_create: triggers matching jobs when an instance is created', async () => {
createJob({ ...testJob, config: JSON.stringify({ api_key: 'k', tailnet: 't', run_on_create: true }) })
const id = (await request(app).get('/api/jobs')).body[0].id
vi.stubGlobal('fetch', vi.fn().mockResolvedValue({ ok: true, json: async () => ({ devices: [] }) }))
await request(app).post('/api/instances').send(base)
await new Promise(r => setImmediate(r))
const detail = await request(app).get(`/api/jobs/${id}`)
expect(detail.body.runs).toHaveLength(1)
expect(detail.body.runs[0].status).toBe('success')
})
it('run_on_create: does not trigger jobs without the flag', async () => {
createJob(testJob)
const id = (await request(app).get('/api/jobs')).body[0].id
await request(app).post('/api/instances').send(base)
await new Promise(r => setImmediate(r))
expect((await request(app).get(`/api/jobs/${id}`)).body.runs).toHaveLength(0)
})
it('semaphore_sync: parses ansible inventory and updates instances', async () => {
const semaphoreJob = {
key: 'semaphore_sync', name: 'Semaphore Sync', description: 'test',
enabled: 0, schedule: 60,
config: JSON.stringify({ api_url: 'http://semaphore:3000/api/project/1/inventory/1', api_token: 'bearer-token' }),
}
createJob(semaphoreJob)
const id = (await request(app).get('/api/jobs')).body[0].id
vi.stubGlobal('fetch', vi.fn().mockResolvedValueOnce({
ok: true,
json: async () => ({ inventory: '[production]\nplex\nhomeassistant\n' }),
}))
const res = await request(app).post(`/api/jobs/${id}/run`)
expect(res.status).toBe(200)
expect(res.body.summary).toMatch(/updated of/)
})
})
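The `semaphore_sync` test feeds in an ansible INI inventory, so the job must pull host names out of it while skipping `[group]` headers. A sketch of that parsing, with illustrative details:

```javascript
// Hypothetical sketch of inventory parsing for semaphore_sync: keep host
// lines, drop [group] headers, comments, and blank lines.
function parseInventoryHosts(inventory) {
  return inventory
    .split('\n')
    .map(line => line.trim())
    .filter(line => line && !line.startsWith('[') && !line.startsWith('#') && !line.startsWith(';'))
    .map(line => line.split(/\s+/)[0])  // drop per-host vars, keep the hostname
}

console.log(parseInventoryHosts('[production]\nplex\nhomeassistant\n'))
// [ 'plex', 'homeassistant' ]
```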


@@ -2,7 +2,9 @@ import { describe, it, expect, beforeEach } from 'vitest'
import {
_resetForTest,
getInstances, getInstance, getDistinctStacks,
-  createInstance, updateInstance, deleteInstance,
+  createInstance, updateInstance, deleteInstance, importInstances, getInstanceHistory,
+  getConfig, setConfig,
+  getJobs, getJob, createJob, updateJob, createJobRun, completeJobRun, getJobRuns,
} from '../server/db.js'
beforeEach(() => _resetForTest());
@@ -56,6 +58,71 @@ describe('getInstances', () => {
createInstance({ name: 'plex2', state: 'degraded', stack: 'production', vmid: 2, atlas: 0, argus: 0, semaphore: 0, patchmon: 0, tailscale: 0, andromeda: 0, tailscale_ip: '', hardware_acceleration: 0 });
expect(getInstances({ search: 'plex', state: 'deployed' })).toHaveLength(1);
});
it('sorts by vmid ascending', () => {
createInstance({ name: 'b', state: 'deployed', stack: 'production', vmid: 200, atlas: 0, argus: 0, semaphore: 0, patchmon: 0, tailscale: 0, andromeda: 0, tailscale_ip: '', hardware_acceleration: 0 });
createInstance({ name: 'a', state: 'deployed', stack: 'production', vmid: 100, atlas: 0, argus: 0, semaphore: 0, patchmon: 0, tailscale: 0, andromeda: 0, tailscale_ip: '', hardware_acceleration: 0 });
const result = getInstances({ sort: 'vmid' });
expect(result[0].vmid).toBe(100);
expect(result[1].vmid).toBe(200);
});
it('sorts by vmid descending', () => {
createInstance({ name: 'a', state: 'deployed', stack: 'production', vmid: 100, atlas: 0, argus: 0, semaphore: 0, patchmon: 0, tailscale: 0, andromeda: 0, tailscale_ip: '', hardware_acceleration: 0 });
createInstance({ name: 'b', state: 'deployed', stack: 'production', vmid: 200, atlas: 0, argus: 0, semaphore: 0, patchmon: 0, tailscale: 0, andromeda: 0, tailscale_ip: '', hardware_acceleration: 0 });
const result = getInstances({ sort: 'vmid', order: 'desc' });
expect(result[0].vmid).toBe(200);
expect(result[1].vmid).toBe(100);
});
it('sorts by name descending', () => {
createInstance({ name: 'alpha', state: 'deployed', stack: 'production', vmid: 1, atlas: 0, argus: 0, semaphore: 0, patchmon: 0, tailscale: 0, andromeda: 0, tailscale_ip: '', hardware_acceleration: 0 });
createInstance({ name: 'zebra', state: 'deployed', stack: 'production', vmid: 2, atlas: 0, argus: 0, semaphore: 0, patchmon: 0, tailscale: 0, andromeda: 0, tailscale_ip: '', hardware_acceleration: 0 });
const result = getInstances({ sort: 'name', order: 'desc' });
expect(result[0].name).toBe('zebra');
expect(result[1].name).toBe('alpha');
});
it('sorts by created_at asc — id is tiebreaker when timestamps are equal (same second)', () => {
// datetime('now') has second precision; rapid inserts share the same timestamp.
// The implementation uses id ASC as secondary sort so insertion order is preserved.
createInstance({ name: 'first', state: 'deployed', stack: 'production', vmid: 1, atlas: 0, argus: 0, semaphore: 0, patchmon: 0, tailscale: 0, andromeda: 0, tailscale_ip: '', hardware_acceleration: 0 });
createInstance({ name: 'second', state: 'deployed', stack: 'production', vmid: 2, atlas: 0, argus: 0, semaphore: 0, patchmon: 0, tailscale: 0, andromeda: 0, tailscale_ip: '', hardware_acceleration: 0 });
const result = getInstances({ sort: 'created_at', order: 'asc' });
expect(result[0].name).toBe('first'); // id=1 before id=2
expect(result[1].name).toBe('second');
});
it('sorts by created_at desc — id is tiebreaker when timestamps are equal (same second)', () => {
createInstance({ name: 'first', state: 'deployed', stack: 'production', vmid: 1, atlas: 0, argus: 0, semaphore: 0, patchmon: 0, tailscale: 0, andromeda: 0, tailscale_ip: '', hardware_acceleration: 0 });
createInstance({ name: 'second', state: 'deployed', stack: 'production', vmid: 2, atlas: 0, argus: 0, semaphore: 0, patchmon: 0, tailscale: 0, andromeda: 0, tailscale_ip: '', hardware_acceleration: 0 });
const result = getInstances({ sort: 'created_at', order: 'desc' });
expect(result[0].name).toBe('second'); // id=2 before id=1
expect(result[1].name).toBe('first');
});
it('sorts by updated_at asc — id is tiebreaker when timestamps are equal (same second)', () => {
createInstance({ name: 'a', state: 'deployed', stack: 'production', vmid: 1, atlas: 0, argus: 0, semaphore: 0, patchmon: 0, tailscale: 0, andromeda: 0, tailscale_ip: '', hardware_acceleration: 0 });
createInstance({ name: 'b', state: 'deployed', stack: 'production', vmid: 2, atlas: 0, argus: 0, semaphore: 0, patchmon: 0, tailscale: 0, andromeda: 0, tailscale_ip: '', hardware_acceleration: 0 });
const result = getInstances({ sort: 'updated_at', order: 'asc' });
expect(result[0].name).toBe('a'); // id=1 before id=2
expect(result[1].name).toBe('b');
});
it('sorts by updated_at desc — id is tiebreaker when timestamps are equal (same second)', () => {
createInstance({ name: 'a', state: 'deployed', stack: 'production', vmid: 1, atlas: 0, argus: 0, semaphore: 0, patchmon: 0, tailscale: 0, andromeda: 0, tailscale_ip: '', hardware_acceleration: 0 });
createInstance({ name: 'b', state: 'deployed', stack: 'production', vmid: 2, atlas: 0, argus: 0, semaphore: 0, patchmon: 0, tailscale: 0, andromeda: 0, tailscale_ip: '', hardware_acceleration: 0 });
const result = getInstances({ sort: 'updated_at', order: 'desc' });
expect(result[0].name).toBe('b'); // id=2 before id=1
expect(result[1].name).toBe('a');
});
it('falls back to name asc for an invalid sort field', () => {
createInstance({ name: 'zebra', state: 'deployed', stack: 'production', vmid: 1, atlas: 0, argus: 0, semaphore: 0, patchmon: 0, tailscale: 0, andromeda: 0, tailscale_ip: '', hardware_acceleration: 0 });
createInstance({ name: 'alpha', state: 'deployed', stack: 'production', vmid: 2, atlas: 0, argus: 0, semaphore: 0, patchmon: 0, tailscale: 0, andromeda: 0, tailscale_ip: '', hardware_acceleration: 0 });
const result = getInstances({ sort: 'injected; DROP TABLE instances--' });
expect(result[0].name).toBe('alpha');
});
});
// ── getInstance ───────────────────────────────────────────────────────────────
@@ -164,4 +231,223 @@ describe('deleteInstance', () => {
expect(getInstance(1)).toBeNull();
expect(getInstance(2)).not.toBeNull();
});
it('clears history for the deleted instance', () => {
createInstance({ ...base, name: 'a', vmid: 1 });
deleteInstance(1);
expect(getInstanceHistory(1)).toHaveLength(0);
});
it('does not clear history for other instances', () => {
createInstance({ ...base, name: 'a', vmid: 1 });
createInstance({ ...base, name: 'b', vmid: 2 });
deleteInstance(1);
expect(getInstanceHistory(2).length).toBeGreaterThan(0);
});
});
// ── importInstances ───────────────────────────────────────────────────────────
describe('importInstances', () => {
const base = { state: 'deployed', stack: 'production', atlas: 0, argus: 0, semaphore: 0, patchmon: 0, tailscale: 0, andromeda: 0, tailscale_ip: '', hardware_acceleration: 0 };
it('replaces all existing instances with the imported set', () => {
createInstance({ ...base, name: 'old', vmid: 1 });
importInstances([{ ...base, name: 'new', vmid: 2 }]);
expect(getInstance(1)).toBeNull();
expect(getInstance(2)).not.toBeNull();
});
it('clears all instances when passed an empty array', () => {
createInstance({ ...base, name: 'a', vmid: 1 });
importInstances([]);
expect(getInstances()).toEqual([]);
});
it('clears history for all replaced instances', () => {
createInstance({ ...base, name: 'old', vmid: 1 });
importInstances([{ ...base, name: 'new', vmid: 2 }]);
expect(getInstanceHistory(1)).toHaveLength(0);
});
it('restores history rows when provided', () => {
importInstances(
[{ ...base, name: 'a', vmid: 1 }],
[{ vmid: 1, field: 'created', old_value: null, new_value: null, changed_at: '2026-01-01 00:00:00' }]
);
const h = getInstanceHistory(1);
expect(h.some(e => e.field === 'created')).toBe(true);
});
});
// ── instance history ─────────────────────────────────────────────────────────
describe('instance history', () => {
const base = { state: 'deployed', stack: 'production', atlas: 0, argus: 0, semaphore: 0, patchmon: 0, tailscale: 0, andromeda: 0, tailscale_ip: '', hardware_acceleration: 0 };
it('logs a created event when an instance is created', () => {
createInstance({ ...base, name: 'a', vmid: 1 });
const h = getInstanceHistory(1);
expect(h).toHaveLength(1);
expect(h[0].field).toBe('created');
});
it('logs changed fields when an instance is updated', () => {
createInstance({ ...base, name: 'a', vmid: 1 });
updateInstance(1, { ...base, name: 'a', vmid: 1, state: 'degraded' });
const h = getInstanceHistory(1);
const stateEvt = h.find(e => e.field === 'state');
expect(stateEvt).toBeDefined();
expect(stateEvt.old_value).toBe('deployed');
expect(stateEvt.new_value).toBe('degraded');
});
it('logs no events when nothing changes on update', () => {
createInstance({ ...base, name: 'a', vmid: 1 });
updateInstance(1, { ...base, name: 'a', vmid: 1 });
const h = getInstanceHistory(1).filter(e => e.field !== 'created');
expect(h).toHaveLength(0);
});
it('records history under the new vmid when vmid changes', () => {
createInstance({ ...base, name: 'a', vmid: 1 });
updateInstance(1, { ...base, name: 'a', vmid: 2 });
expect(getInstanceHistory(2).some(e => e.field === 'vmid')).toBe(true);
expect(getInstanceHistory(1).filter(e => e.field !== 'created')).toHaveLength(0);
});
});
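These history tests imply that `updateInstance` diffs old and new values field-by-field and records only real changes, so a no-op update produces no events. A sketch of that diffing, with illustrative names:

```javascript
// Hypothetical sketch of the change detection behind instance history.
function diffFields(oldRow, newRow, fields) {
  const events = []
  for (const field of fields) {
    const before = String(oldRow[field] ?? '')
    const after = String(newRow[field] ?? '')
    if (before !== after) events.push({ field, old_value: before, new_value: after })
  }
  return events
}

console.log(diffFields({ state: 'deployed' }, { state: 'degraded' }, ['state']))
// [ { field: 'state', old_value: 'deployed', new_value: 'degraded' } ]
console.log(diffFields({ state: 'deployed' }, { state: 'deployed' }, ['state']))
// []
```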
// ── Test environment boot isolation ───────────────────────────────────────────
describe('test environment boot isolation', () => {
it('vitest runs with NODE_ENV=test', () => {
// Vitest sets NODE_ENV=test automatically. This is the guard condition
// that prevents the boot init() from opening the real database file.
expect(process.env.NODE_ENV).toBe('test');
});
it('db module loads cleanly in parallel workers without locking the real db file', () => {
// Regression: the module-level init(DEFAULT_PATH) used to run unconditionally,
// causing "database is locked" when multiple test workers imported db.js at
// the same time. process.exit(1) then killed the worker mid-suite.
// Fix: boot init is skipped when NODE_ENV=test. _resetForTest() handles setup.
// Reaching this line proves the module loaded without calling process.exit.
expect(() => _resetForTest()).not.toThrow();
expect(getInstances()).toEqual([]);
});
});
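The boot guard described in those comments reduces to a single predicate on the environment. Extracted here as a pure function for illustration (db.js presumably inlines the check):

```javascript
// Minimal sketch: only open the real database file outside of test runs.
function shouldBootInit(env) {
  return env.NODE_ENV !== 'test'  // vitest sets NODE_ENV=test, so workers skip init
}

console.log(shouldBootInit({ NODE_ENV: 'test' }))       // false
console.log(shouldBootInit({ NODE_ENV: 'production' })) // true
```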
// ── getConfig / setConfig ─────────────────────────────────────────────────────
describe('getConfig / setConfig', () => {
it('returns defaultVal when key does not exist', () => {
expect(getConfig('missing', 'fallback')).toBe('fallback');
});
it('returns empty string by default', () => {
expect(getConfig('missing')).toBe('');
});
it('stores and retrieves a value', () => {
setConfig('tailscale_api_key', 'tskey-test');
expect(getConfig('tailscale_api_key')).toBe('tskey-test');
});
it('overwrites an existing key', () => {
setConfig('tailscale_enabled', '0');
setConfig('tailscale_enabled', '1');
expect(getConfig('tailscale_enabled')).toBe('1');
});
it('config is cleared by _resetForTest', () => {
setConfig('tailscale_api_key', 'tskey-test');
_resetForTest();
expect(getConfig('tailscale_api_key')).toBe('');
});
});
// ── jobs ──────────────────────────────────────────────────────────────────────
const baseJob = {
key: 'test_job', name: 'Test Job', description: 'desc',
enabled: 0, schedule: 15, config: '{}',
};
describe('jobs', () => {
it('returns empty array when no jobs', () => {
expect(getJobs()).toEqual([]);
});
it('createJob + getJobs returns the job', () => {
createJob(baseJob);
expect(getJobs()).toHaveLength(1);
expect(getJobs()[0].name).toBe('Test Job');
});
it('getJob returns null for unknown id', () => {
expect(getJob(999)).toBeNull();
});
it('updateJob changes enabled and schedule', () => {
createJob(baseJob);
const id = getJobs()[0].id;
updateJob(id, { enabled: 1, schedule: 30, config: '{}' });
expect(getJob(id).enabled).toBe(1);
expect(getJob(id).schedule).toBe(30);
});
it('getJobs includes last_status null when no runs', () => {
createJob(baseJob);
expect(getJobs()[0].last_status).toBeNull();
});
it('getJobs reflects last_status after a run', () => {
createJob(baseJob);
const id = getJobs()[0].id;
const runId = createJobRun(id);
completeJobRun(runId, 'success', 'ok');
expect(getJobs()[0].last_status).toBe('success');
});
});
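The `last_status` tests imply joining each job to its newest run, yielding null for a job that has never run. The real query likely does this in SQL; this JS equivalent is an illustrative sketch:

```javascript
// Hypothetical sketch of last_status derivation, expressed over plain arrays.
function withLastStatus(jobs, runs) {
  return jobs.map(job => {
    const newest = runs
      .filter(run => run.job_id === job.id)
      .sort((a, b) => b.id - a.id)[0]          // highest run id = most recent
    return { ...job, last_status: newest ? newest.status : null }
  })
}

console.log(withLastStatus([{ id: 1 }], [])[0].last_status) // null
console.log(withLastStatus([{ id: 1 }], [{ id: 7, job_id: 1, status: 'success' }])[0].last_status) // success
```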
// ── job_runs ──────────────────────────────────────────────────────────────────
describe('job_runs', () => {
it('createJobRun returns a positive id', () => {
createJob(baseJob);
const id = getJobs()[0].id;
expect(createJobRun(id)).toBeGreaterThan(0);
});
it('new run has status running and no ended_at', () => {
createJob(baseJob);
const id = getJobs()[0].id;
const runId = createJobRun(id);
const runs = getJobRuns(id);
expect(runs[0].status).toBe('running');
expect(runs[0].ended_at).toBeNull();
});
it('completeJobRun sets status, result, and ended_at', () => {
createJob(baseJob);
const id = getJobs()[0].id;
const runId = createJobRun(id);
completeJobRun(runId, 'success', '2 updated of 8');
const run = getJobRuns(id)[0];
expect(run.status).toBe('success');
expect(run.result).toBe('2 updated of 8');
expect(run.ended_at).not.toBeNull();
});
it('getJobRuns returns newest first', () => {
createJob(baseJob);
const id = getJobs()[0].id;
const r1 = createJobRun(id);
const r2 = createJobRun(id);
completeJobRun(r1, 'success', 'first');
completeJobRun(r2, 'error', 'second');
const runs = getJobRuns(id);
expect(runs[0].id).toBe(r2);
expect(runs[1].id).toBe(r1);
});
});


@@ -1,5 +1,7 @@
// @vitest-environment jsdom
import { describe, it, expect } from 'vitest'
import { readFileSync } from 'fs'
import { join } from 'path'
// ── esc() ─────────────────────────────────────────────────────────────────────
// Mirrors the implementation in ui.js exactly (DOM-based).
@@ -56,16 +58,22 @@ describe('esc', () => {
// ── fmtDate() ─────────────────────────────────────────────────────────────────
-function fmtDate(d) {
+function parseUtc(d) {
+  if (typeof d !== 'string') return new Date(d)
+  const hasZone = d.endsWith('Z') || /[+-]\d{2}:\d{2}$/.test(d)
+  return new Date(hasZone ? d : d.replace(' ', 'T') + 'Z')
+}
+function fmtDate(d, tz = 'UTC') {
   if (!d) return '—'
   try {
-    return new Date(d).toLocaleDateString('en-US', { year: 'numeric', month: 'short', day: 'numeric' })
+    return parseUtc(d).toLocaleDateString('en-US', { year: 'numeric', month: 'short', day: 'numeric', timeZone: tz })
   } catch (e) { return d }
 }
describe('fmtDate', () => {
it('formats a valid ISO date string', () => {
-    const result = fmtDate('2024-03-15T00:00:00')
+    const result = fmtDate('2024-03-15T12:00:00Z')
expect(result).toMatch(/Mar/)
expect(result).toMatch(/15/)
expect(result).toMatch(/2024/)
@@ -86,24 +94,42 @@ describe('fmtDate', () => {
// ── fmtDateFull() ─────────────────────────────────────────────────────────────
-function fmtDateFull(d) {
+function fmtDateFull(d, tz = 'UTC') {
   if (!d) return '—'
   try {
-    return new Date(d).toLocaleString('en-US', {
+    return parseUtc(d).toLocaleString('en-US', {
       year: 'numeric', month: 'short', day: 'numeric',
       hour: '2-digit', minute: '2-digit',
+      timeZone: tz, timeZoneName: 'short',
     })
   } catch (e) { return d }
 }
describe('fmtDateFull', () => {
it('includes date and time components', () => {
-    const result = fmtDateFull('2024-03-15T14:30:00')
+    const result = fmtDateFull('2024-03-15T14:30:00Z')
expect(result).toMatch(/Mar/)
expect(result).toMatch(/2024/)
expect(result).toMatch(/\d{1,2}:\d{2}/)
})
it('includes the timezone abbreviation', () => {
expect(fmtDateFull('2024-03-15T14:30:00Z', 'UTC')).toMatch(/UTC/)
})
it('converts to the given timezone', () => {
// 2024-03-15 18:30 UTC = 2024-03-15 14:30 EDT (UTC-4 in March)
const result = fmtDateFull('2024-03-15T18:30:00Z', 'America/New_York')
expect(result).toMatch(/2:30/)
expect(result).toMatch(/EDT/)
})
it('treats SQLite-format timestamps (space, no Z) as UTC', () => {
// SQLite datetime('now') → 'YYYY-MM-DD HH:MM:SS', no timezone marker.
// Must parse identically to the same moment expressed as ISO UTC.
expect(fmtDateFull('2024-03-15 18:30:00', 'UTC')).toBe(fmtDateFull('2024-03-15T18:30:00Z', 'UTC'))
})
it('returns — for null', () => {
expect(fmtDateFull(null)).toBe('—')
})
it('returns — for empty string', () => {
expect(fmtDateFull('')).toBe('—')
})
})
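The SQLite round-trip test above pins down the contract that `parseUtc` (defined earlier in this file) must satisfy: zone-less timestamps are treated as UTC. A minimal sketch consistent with that contract, under a different name so it does not shadow the real helper (the body is illustrative, not the file's actual implementation):

```javascript
// Sketch only, not the real parseUtc. Zone-less SQLite timestamps
// ('YYYY-MM-DD HH:MM:SS') are normalised to ISO form with a trailing Z;
// strings that already carry a Z or a ±HH:MM offset pass through as-is.
function parseUtcSketch(d) {
  const hasZone = /[zZ]$|[+-]\d{2}:\d{2}$/.test(d)
  return hasZone ? new Date(d) : new Date(d.replace(' ', 'T') + 'Z')
}
// parseUtcSketch('2024-03-15 18:30:00') and parseUtcSketch('2024-03-15T18:30:00Z')
// resolve to the same instant.
```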
// ── versionLabel() ───────────────────────────────────────────────────────────
// Mirrors the logic in app.js — semver strings get a v prefix, dev strings don't.
function versionLabel(v) {
return /^\d/.test(v) ? `v${v}` : v
}
describe('version label formatting', () => {
it('prepends v for semver strings', () => {
expect(versionLabel('1.1.2')).toBe('v1.1.2')
expect(versionLabel('2.0.0')).toBe('v2.0.0')
})
it('does not prepend v for dev build strings', () => {
expect(versionLabel('dev-abc1234')).toBe('dev-abc1234')
})
})
// ── fmtHistVal() ─────────────────────────────────────────────────────────────
// Mirrors the logic in ui.js — formats history field values for display.
const BOOL_FIELDS = ['atlas','argus','semaphore','patchmon','tailscale','andromeda','hardware_acceleration']
function fmtHistVal(field, val) {
if (val == null || val === '') return '—'
if (BOOL_FIELDS.includes(field)) return val === '1' ? 'on' : 'off'
return val
}
describe('fmtHistVal', () => {
it('returns — for null', () => {
expect(fmtHistVal('state', null)).toBe('—')
})
it('returns — for empty string', () => {
expect(fmtHistVal('state', '')).toBe('—')
})
it('returns on/off for boolean service fields', () => {
expect(fmtHistVal('atlas', '1')).toBe('on')
expect(fmtHistVal('atlas', '0')).toBe('off')
expect(fmtHistVal('hardware_acceleration', '1')).toBe('on')
})
it('returns the value as-is for non-boolean fields', () => {
expect(fmtHistVal('state', 'deployed')).toBe('deployed')
expect(fmtHistVal('name', 'plex')).toBe('plex')
expect(fmtHistVal('tailscale_ip', '100.64.0.1')).toBe('100.64.0.1')
})
})
// ── stateClass() ─────────────────────────────────────────────────────────────
// Mirrors the logic in ui.js — maps state values to timeline CSS classes.
function stateClass(field, val) {
if (field !== 'state') return ''
return { deployed: 'tl-deployed', testing: 'tl-testing', degraded: 'tl-degraded' }[val] ?? ''
}
describe('stateClass', () => {
it('returns empty string for non-state fields', () => {
expect(stateClass('name', 'plex')).toBe('')
expect(stateClass('stack', 'production')).toBe('')
})
it('returns the correct colour class for each state value', () => {
expect(stateClass('state', 'deployed')).toBe('tl-deployed')
expect(stateClass('state', 'testing')).toBe('tl-testing')
expect(stateClass('state', 'degraded')).toBe('tl-degraded')
})
it('returns empty string for unknown state values', () => {
expect(stateClass('state', 'unknown')).toBe('')
})
})
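ui.js composes the two helpers above when it renders a timeline entry. The sketch below is a hypothetical illustration of that composition, not the actual ui.js code; the helpers are inlined under different names so the snippet runs standalone:

```javascript
// Illustrative only — the real rendering lives in ui.js. Inlined copies of
// fmtHistVal/stateClass (renamed to avoid clashing with the ones above).
const SKETCH_BOOL_FIELDS = ['atlas','argus','semaphore','patchmon','tailscale','andromeda','hardware_acceleration']
const histVal = (field, val) =>
  val == null || val === '' ? '—'
    : SKETCH_BOOL_FIELDS.includes(field) ? (val === '1' ? 'on' : 'off')
    : val
const histClass = (field, val) =>
  field === 'state' ? ({ deployed: 'tl-deployed', testing: 'tl-testing', degraded: 'tl-degraded' }[val] ?? '') : ''

// A timeline entry pairs the formatted value with its state colour class,
// wrapping in a span only when a class applies.
function renderHistEntry(field, val) {
  const cls = histClass(field, val)
  return cls ? `<span class="${cls}">${histVal(field, val)}</span>` : histVal(field, val)
}
// renderHistEntry('state', 'deployed') → '<span class="tl-deployed">deployed</span>'
// renderHistEntry('atlas', '1')        → 'on'
```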
// ── CSS regressions ───────────────────────────────────────────────────────────
const css = readFileSync(join(__dirname, '../css/app.css'), 'utf8')
describe('CSS regressions', () => {
it('.badge has text-align: center so state labels are not left-skewed on cards', () => {
// Regression: badges rendered left-aligned inside the card's flex-end column.
// Without text-align: center, short labels (e.g. "deployed") appear
// left-justified inside their pill rather than centred.
expect(css).toMatch(/\.badge\s*\{[^}]*text-align\s*:\s*center/s)
})
})
// ── CI workflow regressions ───────────────────────────────────────────────────
const ciYml = readFileSync(join(__dirname, '../.gitea/workflows/ci.yml'), 'utf8')
describe('CI workflow regressions', () => {
it('build-dev job passes BUILD_VERSION build arg', () => {
// Regression: dev image showed semver instead of dev-<sha> because
// BUILD_VERSION was never passed to docker build.
expect(ciYml).toContain('BUILD_VERSION')
})
it('short SHA is computed with git rev-parse, not $GITEA_SHA (which is empty)', () => {
// Regression: ${GITEA_SHA::7} expands to "" on Gitea runners — nav showed "dev-".
// git rev-parse --short HEAD works regardless of which env vars the runner sets.
expect(ciYml).toContain('git rev-parse --short HEAD')
expect(ciYml).not.toContain('GITEA_SHA')
})
})
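Both regressions assert on the raw ci.yml text rather than on behaviour. For illustration, a hypothetical fragment of the workflow shape those assertions guard, embedded as a string (job and step layout are assumptions; only BUILD_VERSION and the git rev-parse invocation are pinned by the tests above):

```javascript
// Assumed shape of the guarded ci.yml fragment — names and indentation are
// illustrative. The short SHA comes from git rev-parse, never $GITEA_SHA.
const ciSketch = `
  build-dev:
    steps:
      - run: |
          SHORT_SHA="$(git rev-parse --short HEAD)"
          docker build --build-arg BUILD_VERSION="dev-\${SHORT_SHA}" .
`
// This sketch satisfies the same three checks the regression tests apply
// to the real workflow file.
```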