58 Commits

Author SHA1 Message Date
edf6f674b3 Merge pull request 'v1.6.0' (#65) from dev into main
All checks were successful
CI / test (push) Successful in 15s
Release / release (push) Successful in 52s
CI / build-dev (push) Has been skipped
Reviewed-on: #65
2026-03-28 21:01:27 -04:00
a8d367b4be Merge pull request 'chore: bump to version 1.6.0' (#64) from chore/bump-v1.6.0 into dev
Reviewed-on: #64
2026-03-28 20:59:21 -04:00
5ca0b648ca chore: bump to version 1.6.0
2026-03-28 20:58:41 -04:00
518ed42f60 Merge pull request 'feat: make stats bar cells clickable to filter by state' (#62) from feat/jobs-system into dev
Reviewed-on: #62
2026-03-28 20:53:14 -04:00
a9147b0198 Merge branch 'dev' into feat/jobs-system
2026-03-28 20:52:44 -04:00
2e3484b1d9 feat: make stats bar cells clickable to filter by state
Clicking deployed/testing/degraded sets the state filter to that
value. Clicking total clears all filters. Hover style added.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 20:51:31 -04:00
cb83d11261 Merge pull request 'fix: config is already a parsed object from the jobs API response' (#61) from feat/jobs-system into dev
Reviewed-on: #61
2026-03-28 20:47:46 -04:00
047fd0653e Merge branch 'dev' into feat/jobs-system
2026-03-28 20:47:18 -04:00
027ed52768 fix: config is already a parsed object from the jobs API response
maskJob parses job.config before returning it, so calling JSON.parse
on it again threw an exception. The catch returned false for every
job, so relevant was always empty and _waitForOnCreateJobs returned
immediately without polling.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 20:46:49 -04:00
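The failure mode described in that fix can be sketched in a few lines. The names (maskJob, run_on_create) mirror the commit message, but the shapes are illustrative, not the project's actual code:

```javascript
// Illustrative shape of the bug: maskJob already parses config, so a second
// JSON.parse receives an object, coerces it to "[object Object]", and throws.
function maskJob(row) {
  return { ...row, config: JSON.parse(row.config) };
}

const job = maskJob({ id: 1, config: '{"run_on_create": true}' });

// Before the fix: double-parsing throws, and a catch-all returned false,
// so every job looked irrelevant and the poll loop had nothing to wait for.
let threw = false;
try { JSON.parse(job.config); } catch { threw = true; } // SyntaxError

// After the fix: config is used as the already-parsed object it is.
const isRelevant = job.config.run_on_create === true;
```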
e2935c58c8 Merge pull request 'fix: capture job baseline before POST to avoid race condition' (#60) from feat/jobs-system into dev
Reviewed-on: #60
2026-03-28 20:43:26 -04:00
1bbe743dba fix: capture job baseline before POST to avoid race condition
The previous version snapshotted last_run_id after the 201 response,
but jobs fire immediately server-side — by the time the client fetched
/api/jobs the runs were already complete, so the baseline matched the
new state and the poll loop never detected completion.

Baseline is now captured before the creation POST so it always
reflects pre-run state regardless of job speed.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 20:42:46 -04:00
d88b79e9f0 Merge pull request 'feat: auto-refresh UI after on-create jobs complete' (#59) from feat/jobs-system into dev
Reviewed-on: #59
2026-03-28 20:26:26 -04:00
8a9de6d72a Merge branch 'dev' into feat/jobs-system
2026-03-28 20:25:55 -04:00
ddd528a682 feat: auto-refresh UI after on-create jobs complete
After creating an instance, if any jobs have run_on_create enabled,
the client polls /api/jobs every 500ms until each relevant job has a
new completed run (tracked via last_run_id baseline). The dashboard
or detail page then refreshes automatically. 30s timeout as a safety
net if a job hangs.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 20:25:26 -04:00
03cf2aa9c6 Merge pull request 'fix: millisecond precision timestamps and correct history ordering' (#58) from feat/jobs-system into dev
Reviewed-on: #58
2026-03-28 20:20:42 -04:00
d84674b0c6 Merge branch 'dev' into feat/jobs-system
2026-03-28 20:20:03 -04:00
7999f46ca2 fix: millisecond precision timestamps and correct history ordering
datetime('now') only stores to the second, making same-second events
indistinguishable. Switched all instance_history and job_runs writes
to strftime('%Y-%m-%dT%H:%M:%f', 'now') for millisecond precision.

Reverted getInstanceHistory to ORDER BY changed_at DESC, id DESC so
newest events appear at the top and instance creation (lowest id,
earliest timestamp) is always at the bottom.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 20:19:42 -04:00
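The precision problem is easy to demonstrate. The sketch below mirrors in plain JavaScript what the switch from datetime('now') to strftime('%Y-%m-%dT%H:%M:%f','now') does in SQLite; the helper names are illustrative:

```javascript
// Format a Date at second precision (like SQLite datetime('now')) and at
// millisecond precision (like strftime('%Y-%m-%dT%H:%M:%f', 'now')).
const toSeconds = (d) => d.toISOString().slice(0, 19);
const toMillis  = (d) => d.toISOString().slice(0, 23);

// Two history events written within the same second:
const created = new Date('2026-03-28T20:19:42.100Z');
const updated = new Date('2026-03-28T20:19:42.700Z');

const collide = toSeconds(created) === toSeconds(updated); // true: order is lost
const ordered = toMillis(created) < toMillis(updated);     // true: order is kept
```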
307c5cf9e8 Merge pull request 'fix: initialize jobs nav dot on every page load' (#57) from feat/jobs-system into dev
Reviewed-on: #57
2026-03-28 20:16:02 -04:00
34af8e5a8f Merge branch 'dev' into feat/jobs-system
2026-03-28 20:15:37 -04:00
76d2bffb4f fix: initialize jobs nav dot on every page load
Previously the dot only updated when visiting the Jobs page.
Now a jobs fetch runs at bootstrap so the dot reflects status
immediately on any page, including after a hard refresh.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 20:14:53 -04:00
64de0e432c Merge pull request 'fix: queue on-create jobs sequentially and fix history ordering' (#56) from feat/jobs-system into dev
Reviewed-on: #56
2026-03-28 20:12:31 -04:00
a5b409a348 Merge branch 'dev' into feat/jobs-system
2026-03-28 20:09:59 -04:00
8f35724bde fix: queue on-create jobs sequentially and fix history ordering
runJobsOnCreate now awaits each job before starting the next,
ensuring they don't stomp each other's DB writes in parallel.

getInstanceHistory changed to ORDER BY changed_at ASC, id ASC so
the creation event (lowest id) is always first regardless of
same-second timestamps.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 20:09:32 -04:00
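The sequencing fix reduces to a simple pattern, sketched here with a minimal stand-in for runJobsOnCreate (assuming jobs are async functions; names are illustrative):

```javascript
// Run async jobs one at a time: each job's writes finish before the next
// job starts, so runs can't interleave their DB updates.
async function runSequentially(tasks) {
  const log = [];
  for (const task of tasks) {
    log.push(`start:${task.name}`);
    await task.run();               // wait for this job before starting the next
    log.push(`end:${task.name}`);
  }
  return log;
}
// By contrast, Promise.all(tasks.map(t => t.run())) starts them all at once,
// and their start/end events (and writes) interleave.
```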
cec82a3347 Merge pull request 'feat: run jobs on instance creation when run_on_create is enabled' (#54) from feat/jobs-system into dev
Reviewed-on: #54
2026-03-28 20:01:53 -04:00
883e59789b Merge branch 'dev' into feat/jobs-system
2026-03-28 20:01:20 -04:00
817fdaef13 feat: run jobs on instance creation when run_on_create is enabled
Jobs with run_on_create=true in their config fire automatically
after a new instance is created. Runs are fire-and-forget so they don't
delay the 201 response. The option is exposed as a checkbox in each job's
detail panel.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 20:00:45 -04:00
9295354e72 Merge pull request 'v1.5.0' (#53) from dev into main
Reviewed-on: #53
2026-03-28 19:51:29 -04:00
372cda6a58 Merge pull request 'chore: bump version to 1.5.0' (#52) from feat/jobs-system into dev
Reviewed-on: #52
2026-03-28 19:49:19 -04:00
3301e942ef Merge branch 'dev' into feat/jobs-system
2026-03-28 19:48:48 -04:00
c4ebb76deb chore: bump version to 1.5.0
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 19:48:16 -04:00
bb765453ab Merge pull request 'feat: include job config and run history in export/import backup' (#51) from feat/jobs-system into dev
Reviewed-on: #51
2026-03-28 19:44:37 -04:00
88474d1048 Merge branch 'dev' into feat/jobs-system
2026-03-28 19:44:05 -04:00
954d85ca81 feat: include job config and run history in export/import backup
Export bumped to version 3, now includes jobs (with raw unmasked
config) and job_runs arrays. Import restores them when present and
restarts the scheduler. Payloads without a jobs key leave jobs
untouched, keeping v1/v2 backups fully compatible.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 19:43:34 -04:00
117dfc5f17 Merge pull request 'feat: add Semaphore Sync job' (#50) from feat/jobs-system into dev
Reviewed-on: #50
2026-03-28 19:35:47 -04:00
c39c7a8aef Merge branch 'dev' into feat/jobs-system
2026-03-28 19:35:10 -04:00
a934db1a14 feat: add Semaphore Sync job
Fetches Semaphore project inventory via Bearer auth, parses the
Ansible INI format to extract hostnames, and sets semaphore=1/0
on matching instances.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 19:34:45 -04:00
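The INI-parsing step can be sketched as follows, assuming a typical Ansible inventory layout (parseInventoryHostnames is an illustrative name, not necessarily the project's):

```javascript
// Pull hostnames out of Ansible INI inventory text: skip blanks, comments,
// [section] headers, and bare key=value variable lines; the first token on a
// host line is the hostname (inline vars like ansible_host=... follow it).
function parseInventoryHostnames(ini) {
  const hosts = new Set();
  for (const raw of ini.split('\n')) {
    const line = raw.trim();
    if (!line || line.startsWith('#') || line.startsWith(';')) continue;
    if (line.startsWith('[')) continue;                       // [group] / [group:vars]
    if (line.includes('=') && !line.includes(' ')) continue;  // variable line
    hosts.add(line.split(/\s+/)[0]);
  }
  return [...hosts];
}
```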
ea4c5f7c95 Merge pull request 'feat: add Patchmon Sync job' (#49) from feat/jobs-system into dev
Reviewed-on: #49
2026-03-28 19:24:12 -04:00
5c12acb6c7 Merge branch 'dev' into feat/jobs-system
2026-03-28 19:23:37 -04:00
0b350f3b28 feat: add Patchmon Sync job
Syncs patchmon field on instances by querying the Patchmon hosts API
and matching hostnames. API token masked as REDACTED in responses.
seedJobs now uses INSERT OR IGNORE so new jobs are seeded on existing
installs without re-running the full seed.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 19:22:41 -04:00
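The INSERT OR IGNORE behavior amounts to the following, simulated here with a Map standing in for the jobs table (this seedJobs is a sketch, not the server's code):

```javascript
// Idempotent seeding: a row is inserted only if its key is absent, so
// re-running the seed on an existing install adds new jobs without
// overwriting rows the user has already configured.
function seedJobs(table, defaults) {
  for (const job of defaults) {
    if (!table.has(job.id)) table.set(job.id, job); // INSERT OR IGNORE
  }
  return table;
}
```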
db4071a2cf Merge pull request 'fix: move page-jobs inside main so it renders at the top' (#48) from feat/jobs-system into dev
Reviewed-on: #48
2026-03-28 19:15:38 -04:00
37cd77850e Merge branch 'dev' into feat/jobs-system
2026-03-28 19:15:07 -04:00
14a4826bb6 fix: move page-jobs inside main so it renders at the top
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 19:14:32 -04:00
550135ca37 Merge pull request 'feat: jobs system with dedicated nav page and run history' (#47) from feat/jobs-system into dev
Reviewed-on: #47
2026-03-28 19:10:50 -04:00
d7727badb1 feat: jobs system with dedicated nav page and run history
Replaces ad-hoc Tailscale config tracking with a proper jobs system.
Jobs get their own nav page (master/detail layout), a dedicated DB
table, and full run history persisted forever. Tailscale connection
settings move from the Settings modal into the Jobs page. Registry
pattern makes adding future jobs straightforward.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 19:09:42 -04:00
537d78e71b Merge pull request 'feat: Tailscale sync jobs' (#46) from feat/tailscale-sync-jobs into dev
Reviewed-on: #46
2026-03-28 17:12:35 -04:00
47e9c4faf7 feat: Tailscale sync jobs
Adds a background job system that polls the Tailscale API on a configurable
interval and syncs tailscale status and IPs to instances by hostname match.

- New config table (key/value) in SQLite for persistent server-side settings
- New server/jobs.js: runTailscaleSync + restartJobs scheduler
- GET/PUT /api/config — read and write Tailscale settings; API key masked as **REDACTED** on GET
- POST /api/jobs/tailscale/run — immediate manual sync
- Settings modal: new Tailscale Sync section with enable toggle, tailnet, API key, poll interval, Save + Run Now buttons, last-run status

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 17:11:40 -04:00
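The restartJobs scheduler mentioned above likely reduces to a clear-then-rearm pattern; a hedged sketch (the real server/jobs.js may differ):

```javascript
// Re-arm an interval scheduler: cancel any existing timer, then start a new
// one at the configured poll interval if syncing is enabled. Returns the
// timer handle (or null) so callers can inspect or clear it.
let timer = null;
function restartJobs(run, intervalMs, enabled) {
  if (timer) { clearInterval(timer); timer = null; }
  if (enabled && intervalMs > 0) timer = setInterval(run, intervalMs);
  return timer;
}
```

Calling it again after a settings save replaces the old interval instead of stacking a second one.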
31a5090f4f Merge pull request 'fix: remove internal database ID from frontend' (#45) from fix/hide-internal-id into dev
Reviewed-on: #45
2026-03-28 16:48:19 -04:00
ecdac6fe23 fix: remove internal database ID from frontend
Removed from the instance subtitle and the overview kv grid. The auto-
increment ID is an implementation detail with no user-facing meaning.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 16:47:20 -04:00
07cef73fae Merge pull request 'v1.4.0' (#44) from dev into main
Reviewed-on: #44
2026-03-28 16:16:46 -04:00
1a84edc064 Merge pull request 'chore: bump version to 1.4.0' (#43) from chore/bump-v1.4.0 into dev
Reviewed-on: #43
2026-03-28 16:15:32 -04:00
bfb2c26821 chore: bump version to 1.4.0
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 16:09:08 -04:00
a985268987 Merge pull request 'feat: include history in export/import backup' (#42) from feat/export-import-history into dev
Reviewed-on: #42
2026-03-28 16:06:21 -04:00
218cdb08c5 feat: include history in export/import backup
Export now returns version 2 with a history array alongside instances.
Import accepts the history array and restores all audit events. v1 backups
without a history key still import cleanly.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 16:04:53 -04:00
2855cc7f81 Merge pull request 'feat: mobile-responsive layout under 640px' (#41) from feat/mobile-responsive into dev
Reviewed-on: #41
2026-03-28 15:57:07 -04:00
07d2e215e4 Merge branch 'dev' into feat/mobile-responsive
2026-03-28 15:56:38 -04:00
8ef839d6d0 Merge pull request 'fix: wrap image reference in backticks in release notes' (#40) from fix/release-image-codeblock into dev
Reviewed-on: #40
2026-03-28 15:55:05 -04:00
7af88328c8 feat: mobile-responsive layout under 640px
Single breakpoint, no desktop changes. Key adjustments:
- Reset zoom: 1 (mobile browsers handle scaling)
- Padding drops from 32px to 16px throughout
- Toolbar wraps: search full-width, filters below
- Instance grid and detail grid collapse to single column
- Detail header stacks title above action buttons
- History timeline stacks timestamp above event
- Toggle grid drops from 3 to 2 columns
- Confirm box gets max-width: calc(100vw - 32px) to prevent overflow
- Toast stretches across bottom of screen

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 15:54:12 -04:00
096e2afb3d fix: wrap image reference in backticks in release notes
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 15:44:15 -04:00
13 changed files with 1098 additions and 26 deletions


@@ -97,7 +97,7 @@ jobs:
          if fixes:
              sections.append('### Bug Fixes\n\n' + '\n'.join(fixes))
          notes = '\n\n'.join(sections) or '_No changes_'
-         body = notes + '\n\n### Image\n\n' + img + ':' + v
+         body = notes + '\n\n### Image\n\n`' + img + ':' + v + '`'
          payload = {'tag_name': 'v'+v, 'name': 'Catalyst v'+v, 'body': body, 'draft': False, 'prerelease': False}
          open('/tmp/release_body.json', 'w').write(json.dumps(payload))
          PYEOF


@@ -153,6 +153,8 @@ main { flex: 1; }
 }
 .stat-cell:last-child { border-right: none; }
+.stat-clickable { cursor: pointer; user-select: none; }
+.stat-clickable:hover { background: var(--bg2); }
 .stat-label {
   font-size: 10px;
@@ -712,3 +714,142 @@ select:focus { border-color: var(--accent); }
   0%, 100% { opacity: 1; }
   50% { opacity: 0; }
 }
+/* ── MOBILE ── */
+@media (max-width: 640px) {
+  /* Reset desktop zoom — mobile browsers handle scaling themselves */
+  html { zoom: 1; }
+  /* Nav */
+  nav { padding: 0 16px; }
+  /* Dashboard header */
+  .dash-header { padding: 18px 16px 14px; }
+  /* Stats bar */
+  .stat-cell { padding: 10px 16px; }
+  /* Toolbar — search full-width on first row, filters + button below */
+  .toolbar { flex-wrap: wrap; padding: 10px 16px; gap: 8px; }
+  .search-wrap { max-width: 100%; }
+  .toolbar-right { margin-left: 0; width: 100%; justify-content: flex-end; }
+  /* Instance grid — single column */
+  .instance-grid {
+    grid-template-columns: 1fr;
+    padding: 12px 16px;
+    gap: 8px;
+  }
+  /* Detail page */
+  .detail-page { padding: 16px; }
+  /* Detail header — stack title block above actions */
+  .detail-header { flex-direction: column; align-items: flex-start; gap: 14px; }
+  /* Detail sub — wrap items when they don't fit */
+  .detail-sub { flex-wrap: wrap; row-gap: 4px; }
+  /* Detail grid — single column */
+  .detail-grid { grid-template-columns: 1fr; }
+  /* Toggle grid — 2 columns instead of 3 */
+  .toggle-grid { grid-template-columns: 1fr 1fr; }
+  /* Confirm box — no fixed width on mobile */
+  .confirm-box { width: auto; max-width: calc(100vw - 32px); padding: 18px; }
+  /* History timeline — stack timestamp above event */
+  .tl-item { flex-direction: column; align-items: flex-start; gap: 3px; }
+  .tl-time { order: -1; }
+  /* Toast — stretch across bottom */
+  .toast { right: 16px; left: 16px; bottom: 16px; }
+  /* Jobs — stack sidebar above detail */
+  .jobs-layout { grid-template-columns: 1fr; }
+  .jobs-sidebar { border-right: none; border-bottom: 1px solid var(--border); }
+}
+/* ── JOBS PAGE ───────────────────────────────────────────────────────────────── */
+.jobs-layout {
+  display: grid;
+  grid-template-columns: 220px 1fr;
+  height: calc(100vh - 48px);
+}
+.jobs-sidebar {
+  border-right: 1px solid var(--border);
+  overflow-y: auto;
+}
+.jobs-sidebar-title {
+  padding: 16px 16px 8px;
+  font-size: 10px;
+  font-weight: 600;
+  text-transform: uppercase;
+  letter-spacing: 0.1em;
+  color: var(--text3);
+}
+.job-item {
+  display: flex;
+  align-items: center;
+  gap: 10px;
+  padding: 12px 16px;
+  cursor: pointer;
+  border-bottom: 1px solid var(--border);
+  user-select: none;
+}
+.job-item:hover, .job-item.active { background: var(--bg2); }
+.job-item-name { font-size: 13px; color: var(--text); }
+.jobs-detail {
+  padding: 28px 32px;
+  overflow-y: auto;
+  max-width: 600px;
+}
+.jobs-detail-hd { margin-bottom: 20px; }
+.jobs-detail-title { font-size: 17px; font-weight: 600; color: var(--text); }
+.jobs-detail-desc { font-size: 12px; color: var(--text2); margin-top: 4px; line-height: 1.6; }
+.job-actions { display: flex; gap: 8px; margin: 16px 0 0; }
+.jobs-placeholder { padding: 48px 32px; color: var(--text3); font-size: 13px; }
+/* Shared job status dot */
+.job-dot {
+  width: 7px;
+  height: 7px;
+  border-radius: 50%;
+  flex-shrink: 0;
+  display: inline-block;
+}
+.job-dot--success { background: var(--accent); }
+.job-dot--error { background: var(--red); }
+.job-dot--running { background: var(--amber); animation: pulse 2s ease-in-out infinite; }
+.job-dot--none { background: var(--border2); }
+/* Run history list */
+.run-item {
+  display: grid;
+  grid-template-columns: 10px 1fr 60px 1fr;
+  gap: 0 12px;
+  padding: 7px 0;
+  border-bottom: 1px solid var(--border);
+  font-size: 12px;
+  align-items: baseline;
+}
+.run-item:last-child { border-bottom: none; }
+.run-time { color: var(--text3); }
+.run-status { color: var(--text2); }
+.run-result { color: var(--text); }
+.run-empty { color: var(--text3); font-size: 12px; padding: 8px 0; }
+/* Nav dot */
+.nav-job-dot {
+  display: inline-block;
+  width: 6px;
+  height: 6px;
+  border-radius: 50%;
+  margin-left: 5px;
+  vertical-align: middle;
+}
+.nav-job-dot--success { background: var(--accent); }
+.nav-job-dot--error { background: var(--red); }
+.nav-job-dot--none { display: none; }


@@ -22,6 +22,7 @@
       <span class="nav-divider">·</span>
       <span id="nav-version"></span>
     </div>
+    <button class="nav-btn" onclick="navigate('jobs')">Jobs <span id="nav-jobs-dot" class="nav-job-dot nav-job-dot--none"></span></button>
     <button class="nav-btn" onclick="openSettingsModal()" title="Settings">&#9881;</button>
   </nav>
@@ -69,7 +70,6 @@
         <div class="detail-name" id="detail-name"></div>
         <div class="detail-sub">
           <span><span class="lbl">vmid</span> <span class="val" id="detail-vmid-sub"></span></span>
-          <span><span class="lbl">id</span> <span class="val" id="detail-id-sub"></span></span>
           <span><span class="lbl">created</span> <span class="val" id="detail-created-sub"></span></span>
         </div>
       </div>
@@ -98,6 +98,19 @@
           </div>
         </div>
       </div>
+      <!-- JOBS PAGE -->
+      <div class="page" id="page-jobs">
+        <div class="jobs-layout">
+          <div class="jobs-sidebar">
+            <div class="jobs-sidebar-title">Jobs</div>
+            <div id="jobs-list"></div>
+          </div>
+          <div class="jobs-detail" id="jobs-detail">
+            <div class="jobs-placeholder">Select a job</div>
+          </div>
+        </div>
+      </div>
     </main>
   </div>


@@ -11,12 +11,19 @@ function navigate(page, vmid) {
     document.getElementById('page-detail').classList.add('active');
     history.pushState({ page: 'instance', vmid }, '', `/instance/${vmid}`);
     renderDetailPage(vmid);
+  } else if (page === 'jobs') {
+    document.getElementById('page-jobs').classList.add('active');
+    history.pushState({ page: 'jobs' }, '', '/jobs');
+    renderJobsPage();
   }
 }
 function handleRoute() {
   const m = window.location.pathname.match(/^\/instance\/(\d+)/);
-  if (m) {
+  if (window.location.pathname === '/jobs') {
+    document.getElementById('page-jobs').classList.add('active');
+    renderJobsPage();
+  } else if (m) {
     document.getElementById('page-detail').classList.add('active');
     renderDetailPage(parseInt(m[1], 10));
   } else {
@@ -30,6 +37,9 @@ window.addEventListener('popstate', e => {
   if (e.state?.page === 'instance') {
     document.getElementById('page-detail').classList.add('active');
     renderDetailPage(e.state.vmid);
+  } else if (e.state?.page === 'jobs') {
+    document.getElementById('page-jobs').classList.add('active');
+    renderJobsPage();
   } else {
     document.getElementById('page-dashboard').classList.add('active');
     renderDashboard();
@@ -43,4 +53,6 @@ if (VERSION) {
   document.getElementById('nav-version').textContent = label;
 }
+fetch('/api/jobs').then(r => r.json()).then(_updateJobsNavDot).catch(() => {});
+
 handleRoute();

js/ui.js (190 changed lines)

@@ -71,10 +71,10 @@ async function renderDashboard() {
all.forEach(i => { states[i.state] = (states[i.state] || 0) + 1; }); all.forEach(i => { states[i.state] = (states[i.state] || 0) + 1; });
document.getElementById('stats-bar').innerHTML = ` document.getElementById('stats-bar').innerHTML = `
<div class="stat-cell"><div class="stat-label">total</div><div class="stat-value accent">${all.length}</div></div> <div class="stat-cell stat-clickable" onclick="setStateFilter('')"><div class="stat-label">total</div><div class="stat-value accent">${all.length}</div></div>
<div class="stat-cell"><div class="stat-label">deployed</div><div class="stat-value">${states['deployed'] || 0}</div></div> <div class="stat-cell stat-clickable" onclick="setStateFilter('deployed')"><div class="stat-label">deployed</div><div class="stat-value">${states['deployed'] || 0}</div></div>
<div class="stat-cell"><div class="stat-label">testing</div><div class="stat-value amber">${states['testing'] || 0}</div></div> <div class="stat-cell stat-clickable" onclick="setStateFilter('testing')"><div class="stat-label">testing</div><div class="stat-value amber">${states['testing'] || 0}</div></div>
<div class="stat-cell"><div class="stat-label">degraded</div><div class="stat-value red">${states['degraded'] || 0}</div></div> <div class="stat-cell stat-clickable" onclick="setStateFilter('degraded')"><div class="stat-label">degraded</div><div class="stat-value red">${states['degraded'] || 0}</div></div>
`; `;
await populateStackFilter(); await populateStackFilter();
@@ -95,6 +95,11 @@ async function populateStackFilter() {
}); });
} }
function setStateFilter(state) {
document.getElementById('filter-state').value = state;
filterInstances();
}
async function filterInstances() { async function filterInstances() {
const search = document.getElementById('search-input').value; const search = document.getElementById('search-input').value;
const state = document.getElementById('filter-state').value; const state = document.getElementById('filter-state').value;
@@ -172,7 +177,6 @@ async function renderDetailPage(vmid) {
document.getElementById('detail-vmid-crumb').textContent = vmid; document.getElementById('detail-vmid-crumb').textContent = vmid;
document.getElementById('detail-name').textContent = inst.name; document.getElementById('detail-name').textContent = inst.name;
document.getElementById('detail-vmid-sub').textContent = inst.vmid; document.getElementById('detail-vmid-sub').textContent = inst.vmid;
document.getElementById('detail-id-sub').textContent = inst.id;
document.getElementById('detail-created-sub').textContent = fmtDate(inst.created_at); document.getElementById('detail-created-sub').textContent = fmtDate(inst.created_at);
document.getElementById('detail-identity').innerHTML = ` document.getElementById('detail-identity').innerHTML = `
@@ -180,7 +184,6 @@ async function renderDetailPage(vmid) {
<div class="kv-row"><span class="kv-key">state</span><span class="kv-val"><span class="badge ${esc(inst.state)}">${esc(inst.state)}</span></span></div> <div class="kv-row"><span class="kv-key">state</span><span class="kv-val"><span class="badge ${esc(inst.state)}">${esc(inst.state)}</span></span></div>
<div class="kv-row"><span class="kv-key">stack</span><span class="kv-val"><span class="badge ${esc(inst.stack)}">${esc(inst.stack) || '—'}</span></span></div> <div class="kv-row"><span class="kv-key">stack</span><span class="kv-val"><span class="badge ${esc(inst.stack)}">${esc(inst.stack) || '—'}</span></span></div>
<div class="kv-row"><span class="kv-key">vmid</span><span class="kv-val highlight">${inst.vmid}</span></div> <div class="kv-row"><span class="kv-key">vmid</span><span class="kv-val highlight">${inst.vmid}</span></div>
<div class="kv-row"><span class="kv-key">internal id</span><span class="kv-val">${inst.id}</span></div>
`; `;
document.getElementById('detail-network').innerHTML = ` document.getElementById('detail-network').innerHTML = `
@@ -291,6 +294,10 @@ async function saveInstance() {
hardware_acceleration: +document.getElementById('f-hardware-accel').checked, hardware_acceleration: +document.getElementById('f-hardware-accel').checked,
}; };
// Snapshot job state before creation — jobs fire immediately after the 201
// so the baseline must be captured before the POST, not after.
const jobBaseline = !editingVmid ? await _snapshotJobBaseline() : null;
const result = editingVmid
? await updateInstance(editingVmid, data)
: await createInstance(data);
@@ -300,6 +307,8 @@ async function saveInstance() {
showToast(`${name} ${editingVmid ? 'updated' : 'created'}`, 'success');
closeModal();
if (jobBaseline) await _waitForOnCreateJobs(jobBaseline);
if (currentVmid && document.getElementById('page-detail').classList.contains('active')) {
await renderDetailPage(vmid);
} else {
@@ -307,6 +316,30 @@ async function saveInstance() {
}
}
async function _snapshotJobBaseline() {
const jobs = await fetch('/api/jobs').then(r => r.json());
return new Map(jobs.map(j => [j.id, j.last_run_id ?? null]));
}
async function _waitForOnCreateJobs(baseline) {
const jobs = await fetch('/api/jobs').then(r => r.json());
const relevant = jobs.filter(j => (j.config ?? {}).run_on_create);
if (!relevant.length) return;
const deadline = Date.now() + 30_000;
while (Date.now() < deadline) {
await new Promise(r => setTimeout(r, 500));
const current = await fetch('/api/jobs').then(r => r.json());
const allDone = relevant.every(j => {
const cur = current.find(c => c.id === j.id);
if (!cur) return true;
if (cur.last_run_id === baseline.get(j.id)) return false; // new run not started yet
return cur.last_status !== 'running'; // new run complete
});
if (allDone) return;
}
}
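The baseline-then-poll dance above is easy to misread: a job's `last_run_id` only counts once it differs from the snapshot taken before the POST. A minimal sketch of the completion check — a hypothetical pure-function extraction of the loop body, not part of the codebase:

```javascript
// Hypothetical extraction of the _waitForOnCreateJobs loop body.
// `relevant`  — jobs with run_on_create set
// `current`   — the latest /api/jobs payload
// `baseline`  — Map of job id → last_run_id captured before the POST
function allOnCreateJobsDone(relevant, current, baseline) {
  return relevant.every(j => {
    const cur = current.find(c => c.id === j.id);
    if (!cur) return true;                                    // job deleted mid-wait
    if (cur.last_run_id === baseline.get(j.id)) return false; // new run not started yet
    return cur.last_status !== 'running';                     // new run has finished
  });
}
```

The key design point is that `last_status` alone is ambiguous: a stale `success` from a previous run looks identical to a fresh one, so the baseline comparison is what distinguishes them.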
// ── Confirm Dialog ────────────────────────────────────────────────────────────
function confirmDeleteDialog(inst) {
@@ -384,15 +417,17 @@ async function importDB() {
document.getElementById('confirm-ok').onclick = async () => {
closeConfirm();
try {
-const { instances } = JSON.parse(await file.text());
+const { instances, history = [], jobs, job_runs } = JSON.parse(await file.text());
const res = await fetch('/api/import', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
-body: JSON.stringify({ instances }),
+body: JSON.stringify({ instances, history, jobs, job_runs }),
});
const data = await res.json();
if (!res.ok) { showToast(data.error ?? 'Import failed', 'error'); return; }
-showToast(`Imported ${data.imported} instance${data.imported !== 1 ? 's' : ''}`, 'success');
+const parts = [`${data.imported} instance${data.imported !== 1 ? 's' : ''}`];
if (data.imported_jobs != null) parts.push(`${data.imported_jobs} job${data.imported_jobs !== 1 ? 's' : ''}`);
showToast(`Imported ${parts.join(', ')}`, 'success');
closeSettingsModal();
renderDashboard();
} catch {
@@ -426,3 +461,140 @@ document.getElementById('tz-select').addEventListener('change', e => {
if (m) renderDetailPage(parseInt(m[1], 10));
else renderDashboard();
});
// ── Jobs Page ─────────────────────────────────────────────────────────────────
async function renderJobsPage() {
const jobs = await fetch('/api/jobs').then(r => r.json());
_updateJobsNavDot(jobs);
document.getElementById('jobs-list').innerHTML = jobs.length
? jobs.map(j => `
<div class="job-item" id="job-item-${j.id}" onclick="loadJobDetail(${j.id})">
<span class="job-dot job-dot--${j.last_status ?? 'none'}"></span>
<span class="job-item-name">${esc(j.name)}</span>
</div>`).join('')
: '<div class="jobs-placeholder">No jobs</div>';
if (jobs.length) loadJobDetail(jobs[0].id);
}
async function loadJobDetail(jobId) {
document.querySelectorAll('.job-item').forEach(el => el.classList.remove('active'));
document.getElementById(`job-item-${jobId}`)?.classList.add('active');
const job = await fetch(`/api/jobs/${jobId}`).then(r => r.json());
const cfg = job.config ?? {};
document.getElementById('jobs-detail').innerHTML = `
<div class="jobs-detail-hd">
<div class="jobs-detail-title">${esc(job.name)}</div>
<div class="jobs-detail-desc">${esc(job.description)}</div>
</div>
<div class="form-group">
<label class="form-label" style="display:flex;align-items:center;gap:8px;cursor:pointer">
<input type="checkbox" id="job-enabled" ${job.enabled ? 'checked' : ''}
style="accent-color:var(--accent);width:13px;height:13px">
Enable scheduled runs
</label>
</div>
<div class="form-group">
<label class="form-label" for="job-schedule">Poll interval (minutes)</label>
<input class="form-input" id="job-schedule" type="number" min="1" value="${job.schedule}" style="max-width:100px">
</div>
<div class="form-group">
<label class="form-label" style="display:flex;align-items:center;gap:8px;cursor:pointer">
<input type="checkbox" id="job-run-on-create" ${cfg.run_on_create ? 'checked' : ''}
style="accent-color:var(--accent);width:13px;height:13px">
Run on instance creation
</label>
</div>
${_renderJobConfigFields(job.key, cfg)}
<div class="job-actions">
<button class="btn btn-secondary" onclick="saveJobDetail(${job.id})">Save</button>
<button class="btn btn-secondary" id="job-run-btn" onclick="runJobNow(${job.id})">Run Now</button>
</div>
<div class="detail-section-title" style="margin:28px 0 10px">Run History</div>
${_renderRunList(job.runs)}
`;
}
function _renderJobConfigFields(key, cfg) {
if (key === 'tailscale_sync') return `
<div class="form-group">
<label class="form-label" for="job-cfg-tailnet">Tailnet</label>
<input class="form-input" id="job-cfg-tailnet" type="text"
placeholder="e.g. Tt3Btpm6D921CNTRL" value="${esc(cfg.tailnet ?? '')}">
</div>
<div class="form-group">
<label class="form-label" for="job-cfg-api-key">API Key</label>
<input class="form-input" id="job-cfg-api-key" type="password"
placeholder="tskey-api-…" value="${esc(cfg.api_key ?? '')}">
</div>`;
if (key === 'patchmon_sync' || key === 'semaphore_sync') {
const label = key === 'semaphore_sync' ? 'API Token (Bearer)' : 'API Token (Basic)';
return `
<div class="form-group">
<label class="form-label" for="job-cfg-api-url">API URL</label>
<input class="form-input" id="job-cfg-api-url" type="text"
value="${esc(cfg.api_url ?? '')}">
</div>
<div class="form-group">
<label class="form-label" for="job-cfg-api-token">${label}</label>
<input class="form-input" id="job-cfg-api-token" type="password"
value="${esc(cfg.api_token ?? '')}">
</div>`;
}
return '';
}
function _renderRunList(runs) {
if (!runs?.length) return '<div class="run-empty">No runs yet</div>';
return `<div class="run-list">${runs.map(r => `
<div class="run-item">
<span class="job-dot job-dot--${r.status}"></span>
<span class="run-time">${fmtDateFull(r.started_at)}</span>
<span class="run-status">${esc(r.status)}</span>
<span class="run-result">${esc(r.result)}</span>
</div>`).join('')}</div>`;
}
async function saveJobDetail(jobId) {
const enabled = document.getElementById('job-enabled').checked;
const schedule = document.getElementById('job-schedule').value;
const cfg = {};
const tailnet = document.getElementById('job-cfg-tailnet');
const apiKey = document.getElementById('job-cfg-api-key');
const apiUrl = document.getElementById('job-cfg-api-url');
const apiToken = document.getElementById('job-cfg-api-token');
if (tailnet) cfg.tailnet = tailnet.value.trim();
if (apiKey) cfg.api_key = apiKey.value;
if (apiUrl) cfg.api_url = apiUrl.value.trim();
if (apiToken) cfg.api_token = apiToken.value;
const runOnCreate = document.getElementById('job-run-on-create');
if (runOnCreate) cfg.run_on_create = runOnCreate.checked;
const res = await fetch(`/api/jobs/${jobId}`, {
method: 'PUT',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ enabled, schedule: parseInt(schedule, 10), config: cfg }),
});
if (res.ok) { showToast('Job saved', 'success'); loadJobDetail(jobId); }
else { showToast('Failed to save', 'error'); }
}
async function runJobNow(jobId) {
const btn = document.getElementById('job-run-btn');
btn.disabled = true;
btn.textContent = 'Running…';
try {
const res = await fetch(`/api/jobs/${jobId}/run`, { method: 'POST' });
const data = await res.json();
if (res.ok) { showToast(`Done — ${data.summary}`, 'success'); loadJobDetail(jobId); }
else { showToast(data.error ?? 'Run failed', 'error'); }
} catch { showToast('Run failed', 'error'); }
finally { btn.disabled = false; btn.textContent = 'Run Now'; }
}
function _updateJobsNavDot(jobs) {
const dot = document.getElementById('nav-jobs-dot');
const cls = jobs.some(j => j.last_status === 'error') ? 'error'
: jobs.some(j => j.last_status === 'success') ? 'success'
: 'none';
dot.className = `nav-job-dot nav-job-dot--${cls}`;
}


@@ -1 +1 @@
-const VERSION = "1.3.1";
+const VERSION = "1.5.0";


@@ -1,6 +1,6 @@
{
"name": "catalyst",
-"version": "1.3.1",
+"version": "1.6.0",
"type": "module",
"scripts": {
"start": "node server/server.js",


@@ -17,7 +17,7 @@ function init(path) {
db.exec('PRAGMA foreign_keys = ON');
db.exec('PRAGMA synchronous = NORMAL');
createSchema();
-if (path !== ':memory:') seed();
+if (path !== ':memory:') { seed(); seedJobs(); }
}
function createSchema() {
@@ -53,6 +53,31 @@ function createSchema() {
changed_at TEXT NOT NULL DEFAULT (datetime('now'))
);
CREATE INDEX IF NOT EXISTS idx_history_vmid ON instance_history(vmid);
CREATE TABLE IF NOT EXISTS config (
key TEXT PRIMARY KEY,
value TEXT NOT NULL DEFAULT ''
);
CREATE TABLE IF NOT EXISTS jobs (
id INTEGER PRIMARY KEY AUTOINCREMENT,
key TEXT NOT NULL UNIQUE,
name TEXT NOT NULL,
description TEXT NOT NULL DEFAULT '',
enabled INTEGER NOT NULL DEFAULT 0 CHECK(enabled IN (0,1)),
schedule INTEGER NOT NULL DEFAULT 15,
config TEXT NOT NULL DEFAULT '{}'
);
CREATE TABLE IF NOT EXISTS job_runs (
id INTEGER PRIMARY KEY AUTOINCREMENT,
job_id INTEGER NOT NULL,
started_at TEXT NOT NULL DEFAULT (datetime('now')),
ended_at TEXT,
status TEXT NOT NULL DEFAULT 'running' CHECK(status IN ('running','success','error')),
result TEXT NOT NULL DEFAULT ''
);
CREATE INDEX IF NOT EXISTS idx_job_runs_job_id ON job_runs(job_id);
`);
}
@@ -83,6 +108,29 @@ function seed() {
db.exec('COMMIT');
}
function seedJobs() {
const upsert = db.prepare(`
INSERT OR IGNORE INTO jobs (key, name, description, enabled, schedule, config)
VALUES (?, ?, ?, ?, ?, ?)
`);
const apiKey = getConfig('tailscale_api_key');
const tailnet = getConfig('tailscale_tailnet');
const tsSchedule = parseInt(getConfig('tailscale_poll_minutes', '15'), 10) || 15;
const tsEnabled = getConfig('tailscale_enabled') === '1' ? 1 : 0;
upsert.run('tailscale_sync', 'Tailscale Sync',
'Syncs Tailscale device status and IPs to instances by matching hostnames.',
tsEnabled, tsSchedule, JSON.stringify({ api_key: apiKey, tailnet }));
upsert.run('patchmon_sync', 'Patchmon Sync',
'Syncs Patchmon host registration status to instances by matching hostnames.',
0, 60, JSON.stringify({ api_url: 'http://patchmon:3000/api/v1/api/hosts', api_token: '' }));
upsert.run('semaphore_sync', 'Semaphore Sync',
'Syncs Semaphore inventory membership to instances by matching hostnames.',
0, 60, JSON.stringify({ api_url: 'http://semaphore:3000/api/project/1/inventory/1', api_token: '' }));
}
// ── Queries ───────────────────────────────────────────────────────────────────
export function getInstances(filters = {}) {
@@ -125,7 +173,8 @@ export function createInstance(data) {
@tailscale, @andromeda, @tailscale_ip, @hardware_acceleration)
`).run(data);
db.prepare(
-`INSERT INTO instance_history (vmid, field, old_value, new_value) VALUES (?, 'created', NULL, NULL)`
+`INSERT INTO instance_history (vmid, field, old_value, new_value, changed_at)
+VALUES (?, 'created', NULL, NULL, strftime('%Y-%m-%dT%H:%M:%f', 'now'))`
).run(data.vmid);
}
@@ -136,12 +185,13 @@ export function updateInstance(vmid, data) {
name=@name, state=@state, stack=@stack, vmid=@newVmid,
atlas=@atlas, argus=@argus, semaphore=@semaphore, patchmon=@patchmon,
tailscale=@tailscale, andromeda=@andromeda, tailscale_ip=@tailscale_ip,
-hardware_acceleration=@hardware_acceleration, updated_at=datetime('now')
+hardware_acceleration=@hardware_acceleration, updated_at=strftime('%Y-%m-%dT%H:%M:%f', 'now')
WHERE vmid=@vmid
`).run({ ...data, newVmid: data.vmid, vmid });
const newVmid = data.vmid;
const insertEvt = db.prepare(
-`INSERT INTO instance_history (vmid, field, old_value, new_value) VALUES (?, ?, ?, ?)`
+`INSERT INTO instance_history (vmid, field, old_value, new_value, changed_at)
+VALUES (?, ?, ?, ?, strftime('%Y-%m-%dT%H:%M:%f', 'now'))`
);
for (const field of HISTORY_FIELDS) {
const oldVal = String(old[field] ?? '');
@@ -155,7 +205,7 @@ export function deleteInstance(vmid) {
db.prepare('DELETE FROM instances WHERE vmid = ?').run(vmid);
}
-export function importInstances(rows) {
+export function importInstances(rows, historyRows = []) {
db.exec('BEGIN');
db.exec('DELETE FROM instance_history');
db.exec('DELETE FROM instances');
@@ -168,15 +218,114 @@ export function importInstances(rows) {
@tailscale, @andromeda, @tailscale_ip, @hardware_acceleration)
`);
for (const row of rows) insert.run(row);
if (historyRows.length) {
const insertHist = db.prepare(
`INSERT INTO instance_history (vmid, field, old_value, new_value, changed_at) VALUES (?, ?, ?, ?, ?)`
);
for (const h of historyRows) insertHist.run(h.vmid, h.field, h.old_value ?? null, h.new_value ?? null, h.changed_at);
}
db.exec('COMMIT');
}
export function getInstanceHistory(vmid) {
return db.prepare(
-'SELECT * FROM instance_history WHERE vmid = ? ORDER BY changed_at DESC'
+'SELECT * FROM instance_history WHERE vmid = ? ORDER BY changed_at DESC, id DESC'
).all(vmid);
}
export function getAllHistory() {
return db.prepare('SELECT * FROM instance_history ORDER BY vmid, changed_at').all();
}
export function getAllJobs() {
return db.prepare('SELECT id, key, name, description, enabled, schedule, config FROM jobs ORDER BY id').all();
}
export function getAllJobRuns() {
return db.prepare('SELECT * FROM job_runs ORDER BY job_id, id').all();
}
export function importJobs(jobRows, jobRunRows = []) {
db.exec('BEGIN');
db.exec('DELETE FROM job_runs');
db.exec('DELETE FROM jobs');
const insertJob = db.prepare(`
INSERT INTO jobs (id, key, name, description, enabled, schedule, config)
VALUES (@id, @key, @name, @description, @enabled, @schedule, @config)
`);
for (const j of jobRows) insertJob.run(j);
if (jobRunRows.length) {
const insertRun = db.prepare(`
INSERT INTO job_runs (id, job_id, started_at, ended_at, status, result)
VALUES (@id, @job_id, @started_at, @ended_at, @status, @result)
`);
for (const r of jobRunRows) insertRun.run(r);
}
db.exec('COMMIT');
}
export function getConfig(key, defaultVal = '') {
const row = db.prepare('SELECT value FROM config WHERE key = ?').get(key);
return row ? row.value : defaultVal;
}
export function setConfig(key, value) {
db.prepare(
`INSERT INTO config (key, value) VALUES (?, ?)
ON CONFLICT(key) DO UPDATE SET value = excluded.value`
).run(key, String(value));
}
// ── Jobs ──────────────────────────────────────────────────────────────────────
const JOB_WITH_LAST_RUN = `
SELECT j.*,
r.id AS last_run_id,
r.started_at AS last_run_at,
r.status AS last_status,
r.result AS last_result
FROM jobs j
LEFT JOIN job_runs r
ON r.id = (SELECT id FROM job_runs WHERE job_id = j.id ORDER BY id DESC LIMIT 1)
`;
export function getJobs() {
return db.prepare(JOB_WITH_LAST_RUN + ' ORDER BY j.id').all();
}
export function getJob(id) {
return db.prepare(JOB_WITH_LAST_RUN + ' WHERE j.id = ?').get(id) ?? null;
}
export function createJob(data) {
db.prepare(`
INSERT INTO jobs (key, name, description, enabled, schedule, config)
VALUES (@key, @name, @description, @enabled, @schedule, @config)
`).run(data);
}
export function updateJob(id, { enabled, schedule, config }) {
db.prepare(`
UPDATE jobs SET enabled=@enabled, schedule=@schedule, config=@config WHERE id=@id
`).run({ id, enabled, schedule, config });
}
export function createJobRun(jobId) {
return Number(db.prepare(
`INSERT INTO job_runs (job_id, started_at) VALUES (?, strftime('%Y-%m-%dT%H:%M:%f', 'now'))`
).run(jobId).lastInsertRowid);
}
export function completeJobRun(runId, status, result) {
db.prepare(`
UPDATE job_runs SET ended_at=strftime('%Y-%m-%dT%H:%M:%f', 'now'), status=@status, result=@result WHERE id=@id
`).run({ id: runId, status, result });
}
export function getJobRuns(jobId) {
return db.prepare('SELECT * FROM job_runs WHERE job_id = ? ORDER BY id DESC').all(jobId);
}
// ── Test helpers ──────────────────────────────────────────────────────────────
export function _resetForTest() {

server/jobs.js Normal file

@@ -0,0 +1,150 @@
import { getJobs, getJob, getInstances, updateInstance, createJobRun, completeJobRun } from './db.js';
// ── Handlers ──────────────────────────────────────────────────────────────────
const TAILSCALE_API = 'https://api.tailscale.com/api/v2';
async function tailscaleSyncHandler(cfg) {
const { api_key, tailnet } = cfg;
if (!api_key || !tailnet) throw new Error('Tailscale not configured — set API key and tailnet');
const res = await fetch(
`${TAILSCALE_API}/tailnet/${encodeURIComponent(tailnet)}/devices`,
{ headers: { Authorization: `Bearer ${api_key}` } }
);
if (!res.ok) throw new Error(`Tailscale API ${res.status}`);
const { devices } = await res.json();
const tsMap = new Map(
devices.map(d => [d.hostname, (d.addresses ?? []).find(a => a.startsWith('100.')) ?? ''])
);
const instances = getInstances();
let updated = 0;
for (const inst of instances) {
const tsIp = tsMap.get(inst.name);
const matched = tsIp !== undefined;
const newTailscale = matched ? 1 : (inst.tailscale === 1 ? 0 : inst.tailscale);
const newIp = matched ? tsIp : (inst.tailscale === 1 ? '' : inst.tailscale_ip);
if (newTailscale !== inst.tailscale || newIp !== inst.tailscale_ip) {
const { id: _id, created_at: _ca, updated_at: _ua, ...instData } = inst;
updateInstance(inst.vmid, { ...instData, tailscale: newTailscale, tailscale_ip: newIp });
updated++;
}
}
return { summary: `${updated} updated of ${instances.length}` };
}
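The update rule inside the loop above is asymmetric on purpose: an unmatched instance is only reset when it was previously flagged as on Tailscale, so a manually-unset instance is left alone. A standalone sketch of that rule (a hypothetical extraction for illustration, not part of the codebase; `tsIp` is the device's `100.x` address, or `undefined` when no device matches the hostname):

```javascript
// Hypothetical extraction of the per-instance decision in tailscaleSyncHandler.
// matched      → flag on, store the 100.x address
// unmatched    → clear flag/ip only if the instance was previously flagged,
//                so a manual tailscale=0 is never overwritten
function resolveTailscaleState(inst, tsIp) {
  const matched = tsIp !== undefined;
  return {
    tailscale: matched ? 1 : (inst.tailscale === 1 ? 0 : inst.tailscale),
    tailscale_ip: matched ? tsIp : (inst.tailscale === 1 ? '' : inst.tailscale_ip),
  };
}
```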
// ── Patchmon Sync ─────────────────────────────────────────────────────────────
async function patchmonSyncHandler(cfg) {
const { api_url, api_token } = cfg;
if (!api_url || !api_token) throw new Error('Patchmon not configured — set API URL and token');
const res = await fetch(api_url, {
headers: { Authorization: `Basic ${api_token}` },
});
if (!res.ok) throw new Error(`Patchmon API ${res.status}`);
const data = await res.json();
const items = Array.isArray(data) ? data : (data.hosts ?? data.data ?? []);
const hostSet = new Set(
items.map(h => (typeof h === 'string' ? h : (h.name ?? h.hostname ?? h.host ?? '')))
.filter(Boolean)
);
const instances = getInstances();
let updated = 0;
for (const inst of instances) {
const newPatchmon = hostSet.has(inst.name) ? 1 : 0;
if (newPatchmon !== inst.patchmon) {
const { id: _id, created_at: _ca, updated_at: _ua, ...instData } = inst;
updateInstance(inst.vmid, { ...instData, patchmon: newPatchmon });
updated++;
}
}
return { summary: `${updated} updated of ${instances.length}` };
}
// ── Semaphore Sync ────────────────────────────────────────────────────────────
async function semaphoreSyncHandler(cfg) {
const { api_url, api_token } = cfg;
if (!api_url || !api_token) throw new Error('Semaphore not configured — set API URL and token');
const res = await fetch(api_url, {
headers: { Authorization: `Bearer ${api_token}` },
});
if (!res.ok) throw new Error(`Semaphore API ${res.status}`);
const data = await res.json();
// Inventory is an Ansible INI string; extract bare hostnames
const hostSet = new Set(
(data.inventory ?? '').split('\n')
.map(l => l.trim())
.filter(l => l && !l.startsWith('[') && !l.startsWith('#') && !l.startsWith(';'))
.map(l => l.split(/[\s=]/)[0])
.filter(Boolean)
);
const instances = getInstances();
let updated = 0;
for (const inst of instances) {
const newSemaphore = hostSet.has(inst.name) ? 1 : 0;
if (newSemaphore !== inst.semaphore) {
const { id: _id, created_at: _ca, updated_at: _ua, ...instData } = inst;
updateInstance(inst.vmid, { ...instData, semaphore: newSemaphore });
updated++;
}
}
return { summary: `${updated} updated of ${instances.length}` };
}
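The inventory parsing in `semaphoreSyncHandler` leans on Ansible INI conventions: one host per line, with anything after whitespace or `=` being a host variable, and `[section]` headers, `#`/`;` comments, and blank lines carrying no hostnames. The same extraction as a self-contained sketch (illustrative only, not part of the codebase):

```javascript
// Extract bare hostnames from an Ansible INI inventory string.
// Skips section headers, comments and blank lines; strips host variables.
function extractHostnames(inventory) {
  return new Set(
    (inventory ?? '').split('\n')
      .map(l => l.trim())
      .filter(l => l && !l.startsWith('[') && !l.startsWith('#') && !l.startsWith(';'))
      .map(l => l.split(/[\s=]/)[0])  // keep only the token before a space or '='
      .filter(Boolean)
  );
}
```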
// ── Registry ──────────────────────────────────────────────────────────────────
const HANDLERS = {
tailscale_sync: tailscaleSyncHandler,
patchmon_sync: patchmonSyncHandler,
semaphore_sync: semaphoreSyncHandler,
};
// ── Public API ────────────────────────────────────────────────────────────────
export async function runJob(jobId) {
const job = getJob(jobId);
if (!job) throw new Error('Job not found');
const handler = HANDLERS[job.key];
if (!handler) throw new Error(`No handler for '${job.key}'`);
const cfg = JSON.parse(job.config || '{}');
const runId = createJobRun(jobId);
try {
const result = await handler(cfg);
completeJobRun(runId, 'success', result.summary ?? '');
return result;
} catch (e) {
completeJobRun(runId, 'error', e.message);
throw e;
}
}
const _intervals = new Map();
export async function runJobsOnCreate() {
for (const job of getJobs()) {
const cfg = JSON.parse(job.config || '{}');
if (cfg.run_on_create) {
try { await runJob(job.id); } catch (e) { console.error(`runJobsOnCreate job ${job.id}:`, e); }
}
}
}
export function restartJobs() {
for (const iv of _intervals.values()) clearInterval(iv);
_intervals.clear();
for (const job of getJobs()) {
if (!job.enabled) continue;
const ms = Math.max(1, job.schedule || 15) * 60_000;
const id = job.id;
_intervals.set(id, setInterval(() => runJob(id).catch(() => {}), ms));
}
}


@@ -1,8 +1,11 @@
import { Router } from 'express';
import {
getInstances, getInstance, getDistinctStacks,
-createInstance, updateInstance, deleteInstance, importInstances, getInstanceHistory,
+createInstance, updateInstance, deleteInstance, importInstances, getInstanceHistory, getAllHistory,
getConfig, setConfig, getJobs, getJob, updateJob, getJobRuns,
getAllJobs, getAllJobRuns, importJobs,
} from './db.js';
import { runJob, restartJobs, runJobsOnCreate } from './jobs.js';
export const router = Router();
@@ -12,6 +15,15 @@ const VALID_STATES = ['deployed', 'testing', 'degraded'];
const VALID_STACKS = ['production', 'development'];
const SERVICE_KEYS = ['atlas', 'argus', 'semaphore', 'patchmon', 'tailscale', 'andromeda'];
const REDACTED = '**REDACTED**';
function maskJob(job) {
const cfg = JSON.parse(job.config || '{}');
if (cfg.api_key) cfg.api_key = REDACTED;
if (cfg.api_token) cfg.api_token = REDACTED;
return { ...job, config: cfg };
}
function validate(body) {
const errors = [];
if (!body.name || typeof body.name !== 'string' || !body.name.trim())
@@ -90,6 +102,7 @@ router.post('/instances', (req, res) => {
createInstance(data);
const created = getInstance(data.vmid);
res.status(201).json(created);
runJobsOnCreate().catch(() => {});
} catch (e) {
handleDbError('POST /api/instances', e, res);
}
@@ -116,14 +129,17 @@ router.put('/instances/:vmid', (req, res) => {
// GET /api/export
router.get('/export', (_req, res) => {
const instances = getInstances();
const history = getAllHistory();
const jobs = getAllJobs();
const job_runs = getAllJobRuns();
const date = new Date().toISOString().slice(0, 10);
res.setHeader('Content-Disposition', `attachment; filename="catalyst-backup-${date}.json"`);
-res.json({ version: 1, exported_at: new Date().toISOString(), instances });
+res.json({ version: 3, exported_at: new Date().toISOString(), instances, history, jobs, job_runs });
});
// POST /api/import
router.post('/import', (req, res) => {
-const { instances } = req.body ?? {};
+const { instances, history = [], jobs, job_runs } = req.body ?? {};
if (!Array.isArray(instances)) {
return res.status(400).json({ error: 'body must contain an instances array' });
}
@@ -134,8 +150,15 @@ router.post('/import', (req, res) => {
}
if (errors.length) return res.status(400).json({ errors });
try {
-importInstances(instances.map(normalise));
+importInstances(instances.map(normalise), Array.isArray(history) ? history : []);
-res.json({ imported: instances.length });
+if (Array.isArray(jobs)) {
importJobs(jobs, Array.isArray(job_runs) ? job_runs : []);
try { restartJobs(); } catch (e) { console.error('POST /api/import restartJobs', e); }
}
res.json({
imported: instances.length,
imported_jobs: Array.isArray(jobs) ? jobs.length : undefined,
});
} catch (e) {
console.error('POST /api/import', e);
res.status(500).json({ error: 'internal server error' });
@@ -159,3 +182,49 @@ router.delete('/instances/:vmid', (req, res) => {
handleDbError('DELETE /api/instances/:vmid', e, res);
}
});
// GET /api/jobs
router.get('/jobs', (_req, res) => {
res.json(getJobs().map(maskJob));
});
// GET /api/jobs/:id
router.get('/jobs/:id', (req, res) => {
const id = parseInt(req.params.id, 10);
if (!id) return res.status(400).json({ error: 'invalid id' });
const job = getJob(id);
if (!job) return res.status(404).json({ error: 'job not found' });
res.json({ ...maskJob(job), runs: getJobRuns(id) });
});
// PUT /api/jobs/:id
router.put('/jobs/:id', (req, res) => {
const id = parseInt(req.params.id, 10);
if (!id) return res.status(400).json({ error: 'invalid id' });
const job = getJob(id);
if (!job) return res.status(404).json({ error: 'job not found' });
const { enabled, schedule, config: newCfg } = req.body ?? {};
const existingCfg = JSON.parse(job.config || '{}');
const mergedCfg = { ...existingCfg, ...(newCfg ?? {}) };
if (newCfg?.api_key === REDACTED) mergedCfg.api_key = existingCfg.api_key;
if (newCfg?.api_token === REDACTED) mergedCfg.api_token = existingCfg.api_token;
updateJob(id, {
enabled: enabled != null ? (enabled ? 1 : 0) : job.enabled,
schedule: schedule != null ? (parseInt(schedule, 10) || 15) : job.schedule,
config: JSON.stringify(mergedCfg),
});
try { restartJobs(); } catch (e) { console.error('PUT /api/jobs/:id restartJobs', e); }
res.json(maskJob(getJob(id)));
});
// POST /api/jobs/:id/run
router.post('/jobs/:id/run', async (req, res) => {
const id = parseInt(req.params.id, 10);
if (!id) return res.status(400).json({ error: 'invalid id' });
if (!getJob(id)) return res.status(404).json({ error: 'job not found' });
try {
res.json(await runJob(id));
} catch (e) {
handleDbError('POST /api/jobs/:id/run', e, res);
}
});


@@ -3,6 +3,7 @@ import helmet from 'helmet';
import { fileURLToPath } from 'url';
import { dirname, join } from 'path';
import { router } from './routes.js';
import { restartJobs } from './jobs.js';
const __dirname = dirname(fileURLToPath(import.meta.url));
const PORT = process.env.PORT ?? 3000;
@@ -47,5 +48,6 @@ app.use((err, _req, res, _next) => {
// Boot — only when run directly, not when imported by tests
if (process.argv[1] === fileURLToPath(import.meta.url)) {
restartJobs();
app.listen(PORT, () => console.log(`catalyst on :${PORT}`));
}


@@ -1,7 +1,7 @@
import { describe, it, expect, beforeEach, afterEach, vi } from 'vitest'
import request from 'supertest'
import { app } from '../server/server.js'
-import { _resetForTest } from '../server/db.js'
+import { _resetForTest, createJob } from '../server/db.js'
import * as dbModule from '../server/db.js'
beforeEach(() => _resetForTest())
@@ -275,6 +275,33 @@ describe('GET /api/export', () => {
const res = await request(app).get('/api/export')
expect(res.body.instances).toEqual([])
})
it('returns version 3', async () => {
const res = await request(app).get('/api/export')
expect(res.body.version).toBe(3)
})
it('includes a history array', async () => {
await request(app).post('/api/instances').send(base)
const res = await request(app).get('/api/export')
expect(res.body.history).toBeInstanceOf(Array)
expect(res.body.history.some(e => e.field === 'created')).toBe(true)
})
it('includes jobs and job_runs arrays', async () => {
createJob(testJob)
const res = await request(app).get('/api/export')
expect(res.body.jobs).toBeInstanceOf(Array)
expect(res.body.jobs).toHaveLength(1)
expect(res.body.jobs[0].key).toBe('tailscale_sync')
expect(res.body.job_runs).toBeInstanceOf(Array)
})
it('exports raw job config without masking', async () => {
createJob(testJob)
const res = await request(app).get('/api/export')
expect(res.body.jobs[0].config).toContain('tskey-test')
})
})
// ── POST /api/import ──────────────────────────────────────────────────────────
@@ -309,6 +336,48 @@ describe('POST /api/import', () => {
.send({ instances: [{ ...base, name: undefined, vmid: 1 }] })
expect(res.status).toBe(400)
})
it('restores history when history array is provided', async () => {
await request(app).post('/api/instances').send(base)
const exp = await request(app).get('/api/export')
await request(app).post('/api/instances').send({ ...base, vmid: 999, name: 'other' })
const res = await request(app).post('/api/import').send({
instances: exp.body.instances,
history: exp.body.history,
})
expect(res.status).toBe(200)
const hist = await request(app).get('/api/instances/100/history')
expect(hist.body.some(e => e.field === 'created')).toBe(true)
})
it('succeeds with a v1 backup that has no history key', async () => {
const res = await request(app).post('/api/import')
.send({ instances: [{ ...base, vmid: 1, name: 'legacy' }] })
expect(res.status).toBe(200)
expect(res.body.imported).toBe(1)
})
it('imports jobs and job_runs and returns imported_jobs count', async () => {
const exp = await request(app).get('/api/export')
createJob(testJob)
const fullExport = await request(app).get('/api/export')
const res = await request(app).post('/api/import').send({
instances: fullExport.body.instances,
history: fullExport.body.history,
jobs: fullExport.body.jobs,
job_runs: fullExport.body.job_runs,
})
expect(res.status).toBe(200)
expect(res.body.imported_jobs).toBe(1)
expect((await request(app).get('/api/jobs')).body).toHaveLength(1)
})
it('leaves jobs untouched when no jobs key in payload', async () => {
createJob(testJob)
await request(app).post('/api/import')
.send({ instances: [{ ...base, vmid: 1, name: 'x' }] })
expect((await request(app).get('/api/jobs')).body).toHaveLength(1)
})
})
// ── Static assets & SPA routing ───────────────────────────────────────────────
@@ -421,3 +490,172 @@ describe('error handling — unexpected DB failures', () => {
)
})
})
const testJob = {
key: 'tailscale_sync', name: 'Tailscale Sync', description: 'Test job',
enabled: 0, schedule: 15,
config: JSON.stringify({ api_key: 'tskey-test', tailnet: 'example.com' }),
}
const patchmonJob = {
key: 'patchmon_sync', name: 'Patchmon Sync', description: 'Test patchmon job',
enabled: 0, schedule: 60,
config: JSON.stringify({ api_url: 'http://patchmon:3000/api/v1/api/hosts', api_token: 'secret-token' }),
}
// ── GET /api/jobs ─────────────────────────────────────────────────────────────
describe('GET /api/jobs', () => {
it('returns empty array when no jobs', async () => {
const res = await request(app).get('/api/jobs')
expect(res.status).toBe(200)
expect(res.body).toEqual([])
})
it('returns jobs with masked api key', async () => {
createJob(testJob)
const res = await request(app).get('/api/jobs')
expect(res.body).toHaveLength(1)
expect(res.body[0].config.api_key).toBe('**REDACTED**')
})
it('returns jobs with masked api_token', async () => {
createJob(patchmonJob)
const res = await request(app).get('/api/jobs')
expect(res.body[0].config.api_token).toBe('**REDACTED**')
})
})
// ── GET /api/jobs/:id ─────────────────────────────────────────────────────────
describe('GET /api/jobs/:id', () => {
it('returns job with runs array', async () => {
createJob(testJob)
const id = (await request(app).get('/api/jobs')).body[0].id
const res = await request(app).get(`/api/jobs/${id}`)
expect(res.status).toBe(200)
expect(res.body.runs).toBeInstanceOf(Array)
})
it('returns 404 for unknown id', async () => {
expect((await request(app).get('/api/jobs/999')).status).toBe(404)
})
it('returns 400 for non-numeric id', async () => {
expect((await request(app).get('/api/jobs/abc')).status).toBe(400)
})
})
// ── PUT /api/jobs/:id ─────────────────────────────────────────────────────────
describe('PUT /api/jobs/:id', () => {
it('updates enabled and schedule', async () => {
createJob(testJob)
const id = (await request(app).get('/api/jobs')).body[0].id
const res = await request(app).put(`/api/jobs/${id}`).send({ enabled: true, schedule: 30 })
expect(res.status).toBe(200)
expect(res.body.enabled).toBe(1)
expect(res.body.schedule).toBe(30)
})
it('does not overwrite api_key when **REDACTED** is sent', async () => {
createJob(testJob)
const id = (await request(app).get('/api/jobs')).body[0].id
await request(app).put(`/api/jobs/${id}`).send({ config: { api_key: '**REDACTED**' } })
expect(dbModule.getJob(id).config).toContain('tskey-test')
})
it('returns 404 for unknown id', async () => {
expect((await request(app).put('/api/jobs/999').send({})).status).toBe(404)
})
})
// ── POST /api/jobs/:id/run ────────────────────────────────────────────────────
describe('POST /api/jobs/:id/run', () => {
afterEach(() => vi.unstubAllGlobals())
it('returns 404 for unknown id', async () => {
expect((await request(app).post('/api/jobs/999/run')).status).toBe(404)
})
it('runs job, returns summary, and logs the run', async () => {
createJob(testJob)
const id = (await request(app).get('/api/jobs')).body[0].id
vi.stubGlobal('fetch', vi.fn().mockResolvedValueOnce({
ok: true,
json: async () => ({ devices: [] }),
}))
const res = await request(app).post(`/api/jobs/${id}/run`)
expect(res.status).toBe(200)
expect(res.body.summary).toBeDefined()
const detail = await request(app).get(`/api/jobs/${id}`)
expect(detail.body.runs).toHaveLength(1)
expect(detail.body.runs[0].status).toBe('success')
})
it('logs error run on failure', async () => {
createJob(testJob)
const id = (await request(app).get('/api/jobs')).body[0].id
vi.stubGlobal('fetch', vi.fn().mockRejectedValueOnce(new Error('network error')))
const res = await request(app).post(`/api/jobs/${id}/run`)
expect(res.status).toBe(500)
const detail = await request(app).get(`/api/jobs/${id}`)
expect(detail.body.runs[0].status).toBe('error')
})
it('patchmon_sync: marks instances present in host list as patchmon=1', async () => {
createJob(patchmonJob)
const id = (await request(app).get('/api/jobs')).body[0].id
vi.stubGlobal('fetch', vi.fn().mockResolvedValueOnce({
ok: true,
json: async () => [{ name: 'plex' }, { name: 'traefik' }],
}))
const res = await request(app).post(`/api/jobs/${id}/run`)
expect(res.status).toBe(200)
expect(res.body.summary).toMatch(/updated of/)
})
it('patchmon_sync: returns 500 when API token is missing', async () => {
createJob({ ...patchmonJob, config: JSON.stringify({ api_url: 'http://patchmon:3000/api/v1/api/hosts', api_token: '' }) })
const id = (await request(app).get('/api/jobs')).body[0].id
const res = await request(app).post(`/api/jobs/${id}/run`)
expect(res.status).toBe(500)
})
it('run_on_create: triggers matching jobs when an instance is created', async () => {
createJob({ ...testJob, config: JSON.stringify({ api_key: 'k', tailnet: 't', run_on_create: true }) })
const id = (await request(app).get('/api/jobs')).body[0].id
vi.stubGlobal('fetch', vi.fn().mockResolvedValue({ ok: true, json: async () => ({ devices: [] }) }))
await request(app).post('/api/instances').send(base)
await new Promise(r => setImmediate(r))
const detail = await request(app).get(`/api/jobs/${id}`)
expect(detail.body.runs).toHaveLength(1)
expect(detail.body.runs[0].status).toBe('success')
})
it('run_on_create: does not trigger jobs without the flag', async () => {
createJob(testJob)
const id = (await request(app).get('/api/jobs')).body[0].id
await request(app).post('/api/instances').send(base)
await new Promise(r => setImmediate(r))
expect((await request(app).get(`/api/jobs/${id}`)).body.runs).toHaveLength(0)
})
it('semaphore_sync: parses ansible inventory and updates instances', async () => {
const semaphoreJob = {
key: 'semaphore_sync', name: 'Semaphore Sync', description: 'test',
enabled: 0, schedule: 60,
config: JSON.stringify({ api_url: 'http://semaphore:3000/api/project/1/inventory/1', api_token: 'bearer-token' }),
}
createJob(semaphoreJob)
const id = (await request(app).get('/api/jobs')).body[0].id
vi.stubGlobal('fetch', vi.fn().mockResolvedValueOnce({
ok: true,
json: async () => ({ inventory: '[production]\nplex\nhomeassistant\n' }),
}))
const res = await request(app).post(`/api/jobs/${id}/run`)
expect(res.status).toBe(200)
expect(res.body.summary).toMatch(/updated of/)
})
})


@@ -3,6 +3,8 @@ import {
_resetForTest,
getInstances, getInstance, getDistinctStacks,
createInstance, updateInstance, deleteInstance, importInstances, getInstanceHistory,
getConfig, setConfig,
getJobs, getJob, createJob, updateJob, createJobRun, completeJobRun, getJobRuns,
} from '../server/db.js'
beforeEach(() => _resetForTest());
@@ -202,6 +204,15 @@ describe('importInstances', () => {
importInstances([{ ...base, name: 'new', vmid: 2 }]);
expect(getInstanceHistory(1)).toHaveLength(0);
});
it('restores history rows when provided', () => {
importInstances(
[{ ...base, name: 'a', vmid: 1 }],
[{ vmid: 1, field: 'created', old_value: null, new_value: null, changed_at: '2026-01-01 00:00:00' }]
);
const h = getInstanceHistory(1);
expect(h.some(e => e.field === 'created')).toBe(true);
});
});
// ── instance history ─────────────────────────────────────────────────────────
@@ -260,3 +271,118 @@ describe('test environment boot isolation', () => {
expect(getInstances()).toEqual([]);
});
});
// ── getConfig / setConfig ─────────────────────────────────────────────────────
describe('getConfig / setConfig', () => {
it('returns defaultVal when key does not exist', () => {
expect(getConfig('missing', 'fallback')).toBe('fallback');
});
it('returns empty string by default', () => {
expect(getConfig('missing')).toBe('');
});
it('stores and retrieves a value', () => {
setConfig('tailscale_api_key', 'tskey-test');
expect(getConfig('tailscale_api_key')).toBe('tskey-test');
});
it('overwrites an existing key', () => {
setConfig('tailscale_enabled', '0');
setConfig('tailscale_enabled', '1');
expect(getConfig('tailscale_enabled')).toBe('1');
});
it('config is cleared by _resetForTest', () => {
setConfig('tailscale_api_key', 'tskey-test');
_resetForTest();
expect(getConfig('tailscale_api_key')).toBe('');
});
});
// ── jobs ──────────────────────────────────────────────────────────────────────
const baseJob = {
key: 'test_job', name: 'Test Job', description: 'desc',
enabled: 0, schedule: 15, config: '{}',
};
describe('jobs', () => {
it('returns empty array when no jobs', () => {
expect(getJobs()).toEqual([]);
});
it('createJob + getJobs returns the job', () => {
createJob(baseJob);
expect(getJobs()).toHaveLength(1);
expect(getJobs()[0].name).toBe('Test Job');
});
it('getJob returns null for unknown id', () => {
expect(getJob(999)).toBeNull();
});
it('updateJob changes enabled and schedule', () => {
createJob(baseJob);
const id = getJobs()[0].id;
updateJob(id, { enabled: 1, schedule: 30, config: '{}' });
expect(getJob(id).enabled).toBe(1);
expect(getJob(id).schedule).toBe(30);
});
it('getJobs includes last_status null when no runs', () => {
createJob(baseJob);
expect(getJobs()[0].last_status).toBeNull();
});
it('getJobs reflects last_status after a run', () => {
createJob(baseJob);
const id = getJobs()[0].id;
const runId = createJobRun(id);
completeJobRun(runId, 'success', 'ok');
expect(getJobs()[0].last_status).toBe('success');
});
});
// ── job_runs ──────────────────────────────────────────────────────────────────
describe('job_runs', () => {
it('createJobRun returns a positive id', () => {
createJob(baseJob);
const id = getJobs()[0].id;
expect(createJobRun(id)).toBeGreaterThan(0);
});
it('new run has status running and no ended_at', () => {
createJob(baseJob);
const id = getJobs()[0].id;
const runId = createJobRun(id);
const runs = getJobRuns(id);
expect(runs[0].status).toBe('running');
expect(runs[0].ended_at).toBeNull();
});
it('completeJobRun sets status, result, and ended_at', () => {
createJob(baseJob);
const id = getJobs()[0].id;
const runId = createJobRun(id);
completeJobRun(runId, 'success', '2 updated of 8');
const run = getJobRuns(id)[0];
expect(run.status).toBe('success');
expect(run.result).toBe('2 updated of 8');
expect(run.ended_at).not.toBeNull();
});
it('getJobRuns returns newest first', () => {
createJob(baseJob);
const id = getJobs()[0].id;
const r1 = createJobRun(id);
const r2 = createJobRun(id);
completeJobRun(r1, 'success', 'first');
completeJobRun(r2, 'error', 'second');
const runs = getJobRuns(id);
expect(runs[0].id).toBe(r2);
expect(runs[1].id).toBe(r1);
});
});