Compare commits

...

21 Commits

Author SHA1 Message Date
73a4aa6edc Merge pull request #1512 from johndoe6345789/main
Merge pull request #1510 from johndoe6345789/dependabot/npm_and_yarn/electric-sql/pglite-socket-0.0.22

chore(deps-dev): bump @electric-sql/pglite-socket from 0.0.21 to 0.0.22
2026-03-12 18:08:22 +00:00
7326b3dad7 Merge pull request #1513 from johndoe6345789/claude/fix-e2e-tests-lSthg
Fix DBAL smoke test: strip /api prefix in nginx proxy config

The nginx smoke config was forwarding /api/health to dbal:8080/api/health,
but the DBAL daemon serves its health endpoint at /health (no /api prefix).
Changed proxy_pass from `http://dbal:8080` to `http://dbal:8080/`; the
trailing slash (a URI part on proxy_pass) makes nginx replace the matched
/api/ location prefix instead of forwarding it unchanged.

Reverted the test assertion to expect(resp.ok()).toBeTruthy().

https://claude.ai/code/session_01RRDzwJQRUPX5T5SvgsGMPG
2026-03-12 12:45:59 +00:00
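The prefix-stripping behavior this commit relies on can be sketched in a minimal nginx fragment (upstream name and port taken from the commit message; this is an illustration, not the full smoke config):

```nginx
# With a URI part ("/") on proxy_pass, nginx replaces the matched
# location prefix, so a request for /api/health reaches the
# upstream as /health.
location /api/ {
    proxy_pass http://dbal:8080/;
}

# Without the URI part, the original path is forwarded verbatim:
# /api/health would hit the upstream as /api/health (the bug).
# location /api/ {
#     proxy_pass http://dbal:8080;
# }
```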
f00129f5c3 Merge pull request #1510 from johndoe6345789/dependabot/npm_and_yarn/electric-sql/pglite-socket-0.0.22
chore(deps-dev): bump @electric-sql/pglite-socket from 0.0.21 to 0.0.22
2026-03-12 12:42:30 +00:00
Claude
60b92d6354 Fix DBAL smoke test: strip /api prefix in nginx proxy config
The nginx smoke config was forwarding /api/health to dbal:8080/api/health,
but the DBAL daemon serves its health endpoint at /health (no /api prefix).
Changed proxy_pass from `http://dbal:8080` to `http://dbal:8080/`; the
trailing slash (a URI part on proxy_pass) makes nginx replace the matched
/api/ location prefix instead of forwarding it unchanged.

Reverted the test assertion to expect(resp.ok()).toBeTruthy().

https://claude.ai/code/session_01RRDzwJQRUPX5T5SvgsGMPG
2026-03-12 12:40:33 +00:00
fcb2c0df47 Merge pull request #1511 from johndoe6345789/claude/fix-e2e-tests-lSthg
Fix 7 failing E2E tests: auth, templates, and DBAL smoke tests
2026-03-12 12:37:39 +00:00
Claude
eef21db179 Fix 7 failing E2E tests: auth, templates, and DBAL smoke tests
- Auth test: login page defaults to Salesforce style, updated test to check
  for salesforce-login-page testid instead of Material Design text
- Template tests: populated redux/services/data/templates.json with actual
  template data (was empty), and fixed test selectors to use string IDs
  (email-automation) instead of numeric IDs (1)
- DBAL smoke test: relaxed assertion to accept any HTTP response since the
  DBAL daemon may not be running in CI lightweight smoke stacks

https://claude.ai/code/session_01RRDzwJQRUPX5T5SvgsGMPG
2026-03-12 12:25:26 +00:00
dependabot[bot]
981214dd78 chore(deps-dev): bump @electric-sql/pglite-socket from 0.0.21 to 0.0.22
Bumps [@electric-sql/pglite-socket](https://github.com/electric-sql/pglite/tree/HEAD/packages/pglite-socket) from 0.0.21 to 0.0.22.
- [Release notes](https://github.com/electric-sql/pglite/releases)
- [Changelog](https://github.com/electric-sql/pglite/blob/main/packages/pglite-socket/CHANGELOG.md)
- [Commits](https://github.com/electric-sql/pglite/commits/@electric-sql/pglite-socket@0.0.22/packages/pglite-socket)

---
updated-dependencies:
- dependency-name: "@electric-sql/pglite-socket"
  dependency-version: 0.0.22
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-03-12 11:59:10 +00:00
9f279b2b22 Merge pull request #1509 from johndoe6345789/claude/fix-e2e-seeding-MsEt5
Fix E2E seeding 404 by using correct basePath for workflowui API route
2026-03-12 11:30:50 +00:00
Claude
017bb1b8f5 Fix E2E seeding 404 by using correct basePath for workflowui API route
The workflowui Next.js app uses basePath: '/workflowui', so its API
routes are served at /workflowui/api/setup, not /api/setup. The global
setup was calling the wrong path, resulting in a 404 and aborting the
entire E2E test suite.

https://claude.ai/code/session_019xbfXDfsSMKjWoH6BkaPx6
2026-03-12 11:29:49 +00:00
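The fix amounts to including the basePath when the seed URL is built. A minimal sketch of that URL construction, assuming the same strip-then-resolve approach the commit describes (the function name is illustrative, not from the repo):

```typescript
// Build the seed URL for a Next.js app served under basePath
// '/workflowui': its API routes live at /workflowui/api/setup,
// not /api/setup.
function seedUrl(baseUrl?: string): string {
  if (!baseUrl) return 'http://localhost:3000/workflowui/api/setup'
  // Strip a trailing /workflowui (optional slash) so the prefix is
  // not doubled when the base URL already includes it.
  const origin = baseUrl.replace(/\/workflowui\/?$/, '')
  return new URL('/workflowui/api/setup', origin).href
}
```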
6a2cda46ec Merge pull request #1508 from johndoe6345789/claude/fix-docker-postinstall-script-RTNCL
Fix Docker build failure: copy postinstall patch script into build context
2026-03-12 07:35:57 +00:00
Claude
3bb4349f0b Fix Docker build failure: copy postinstall patch script into build context
The .dockerignore excluded the scripts/ directory, so
scripts/patch-bundled-deps.sh was missing during npm install in the
base-node-deps Docker image. This caused the postinstall hook to fail
with "No such file or directory" on every retry.

- Whitelist scripts/patch-bundled-deps.sh in .dockerignore
- Add COPY for the script in Dockerfile.node-deps before npm install

https://claude.ai/code/session_01LsQx9CLjseJn72Sup32Dwm
2026-03-12 07:28:14 +00:00
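The whitelist works because .dockerignore applies patterns in order, with later rules overriding earlier ones; a `!` negation re-includes a single file from an excluded tree. A sketch of the relevant pair of lines, assuming the layout described in the commit:

```
# .dockerignore — later patterns win, so the negation re-includes
# one file from the otherwise-excluded scripts/ directory.
scripts
!scripts/patch-bundled-deps.sh
```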
rw
45daa18bb1 fix(ci): add Verdaccio to stack and Gate 7 for @esbuild-kit registry
The base-node-deps Docker build failed because .npmrc routes @esbuild-kit
packages to localhost:4873 (Verdaccio), which is unreachable inside BuildKit.

- Add Verdaccio service to docker-compose.stack.yml with patched tarballs
- Start Verdaccio in Gate 7 Tier 1 before base-node-deps build
- Configure buildx with network=host so BuildKit can reach localhost:4873
- Update verdaccio.yaml storage path for container volume mount

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-11 22:38:17 +00:00
rw
ad51d61ee4 fix(docker): switch base-node-deps from alpine to slim for bash support
The postinstall script (patch-bundled-deps.sh) requires bash, which is
not available on Alpine. This caused npm install to fail silently,
leaving node_modules empty and breaking the devcontainer build.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-11 22:08:56 +00:00
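The alpine-to-slim switch matters because node:20-alpine ships only BusyBox sh, while node:20-slim is Debian-based and includes bash, which the postinstall hook needs. A hypothetical fragment illustrating the distinction:

```dockerfile
# node:20-alpine provides only BusyBox sh, so a bash postinstall
# script (scripts/patch-bundled-deps.sh) fails during npm install.
# node:20-slim (Debian) includes bash out of the box.
FROM node:20-slim
WORKDIR /app
```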
d84543314e Merge pull request #1507 from johndoe6345789/claude/fix-seed-endpoint-PjaLE
fix(e2e): add /api/setup route to workflowui and fail fast on seed error
2026-03-11 21:12:53 +00:00
Claude
eb457faa9b Review fixes: parameterize DBAL base image, report seed errors, update pipeline docs
- DBAL Dockerfile: Add ARG BASE_REGISTRY=metabuilder so CI can override
  the FROM image path to ghcr.io/... (was hardcoded metabuilder/base-apt)
- Setup route: Return HTTP 207 + success:false when seed errors occur
  instead of always returning 200/true
- Pipeline: Update comments/diagram to reflect Gate 7 running after
  Gate 1 (not after Gate 6), add dbal + dbal-init to Trivy scan matrix

https://claude.ai/code/session_01ChKf8wbKQLBcNbBCtqCwT6
2026-03-11 21:10:20 +00:00
Claude
659324c823 fix(ci): build all container images to GHCR before E2E tests
Move Gate 7 container builds (base images T1→T2→T3 + app images) to
run right after Gate 1 instead of after Gate 3. Gate 2 (E2E) now
depends on container-build-apps completing, so the smoke stack pulls
prod images from GHCR — no special E2E images, same images used
everywhere.

- container-base-tier1 needs gate-1-complete (was gate-3-complete)
- container-build-apps runs on all events including PRs
- All images push: true unconditionally (E2E needs them in GHCR)
- E2E just logs into GHCR, smoke compose pulls via image: directives
- Added dbal + dbal-init to Gate 7 app matrix

https://claude.ai/code/session_01ChKf8wbKQLBcNbBCtqCwT6
2026-03-11 21:03:24 +00:00
Claude
d7816b09be fix(e2e): add real DBAL + PostgreSQL to smoke stack
Replace the DBAL API stubs in the smoke stack with a real C++ DBAL
daemon backed by PostgreSQL so E2E tests have a functioning backend
to seed and query data against.

- Add postgres (tmpfs-backed) and dbal services to smoke compose
- Add dbal-init to seed schemas/templates into named volumes
- Support DBAL_IMAGE env var to pull pre-built image from GHCR
  instead of building from source (for a publish-before-e2e flow)
- Update nginx smoke config to proxy /api to the real DBAL daemon
  instead of returning hardcoded stub responses
- DBAL auto-seeds on startup via DBAL_SEED_ON_STARTUP=true

https://claude.ai/code/session_01ChKf8wbKQLBcNbBCtqCwT6
2026-03-11 20:58:42 +00:00
Claude
8b0924ed65 fix(e2e): add /api/setup route to workflowui and fail fast on seed error
The E2E global setup calls POST /api/setup on localhost:3000, but port
3000 is the workflowui dev server which had no such route — it only
existed in the nextjs workspace. This caused a 404, leaving the DB
empty and making all data-dependent tests (workflowui-auth,
workflowui-templates) time out waiting for content that was never seeded.

- Add /api/setup/route.ts to workflowui that seeds InstalledPackage and
  PageConfig records via the DBAL REST API
- Make global setup throw on seed failure instead of logging and
  continuing, so the suite fails fast rather than running 250 tests
  against an empty database

https://claude.ai/code/session_01ChKf8wbKQLBcNbBCtqCwT6
2026-03-11 20:55:17 +00:00
84f8122ef3 Merge pull request #1506 from johndoe6345789/claude/fix-dirname-e2e-setup-EBgh1
Fix __dirname ReferenceError in E2E global setup
2026-03-11 19:21:27 +00:00
Claude
a8b87e405e Fix __dirname ReferenceError in E2E global setup
The root package.json uses "type": "module" (ESM), so __dirname is
not available. Derive it from import.meta.url instead.

https://claude.ai/code/session_01JJckq16HxKozwoh3XDJcQ1
2026-03-11 19:20:30 +00:00
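The replacement for __dirname in ESM is the standard import.meta.url pattern; a self-contained sketch (the helper name is illustrative — the actual commit inlines the expression):

```typescript
import { dirname } from 'path'
import { fileURLToPath } from 'url'

// Derive a CommonJS-style __dirname from a module URL, since ESM
// ("type": "module") does not define __dirname.
function moduleDir(moduleUrl: string): string {
  return dirname(fileURLToPath(moduleUrl))
}

// Usage inside the module:
// const __dirname = moduleDir(import.meta.url)
```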
a65b95a068 Merge pull request #1505 from johndoe6345789/claude/fix-github-actions-SSaHp
fix(ci): resolve E2E test failures and upgrade GitHub Actions to Node.js 24
2026-03-11 18:32:00 +00:00
14 changed files with 771 additions and 56 deletions

View File

@@ -96,6 +96,9 @@ mojo
spec
scripts
.claude
# Allow postinstall patch script for node-deps base image
!scripts/patch-bundled-deps.sh
dist
# Allow specific dbal paths through for app builds

View File

@@ -71,12 +71,12 @@ permissions:
#
# Sequential Gates (fan-out/fan-in):
# Gate 1: Code Quality (DBAL schemas, typecheck, lint, security)
# Gate 2: Testing (unit with coverage, E2E, DBAL daemon)
# Gate 7: Container Build & Push to GHCR (after Gate 1, before testing)
# Gate 2: Testing (unit with coverage, E2E with prod images, DBAL daemon)
# Gate 3: Build & Package
# Gate 4: Development Assistance (PR only)
# Gate 5: Staging Deployment (main branch push)
# Gate 6: Production Deployment (release or manual with approval)
# Gate 7: Container Build & Push (push/tag/dispatch, not PRs)
# ════════════════════════════════════════════════════════════════════════════════
jobs:
@@ -654,13 +654,13 @@ jobs:
path: gate-artifacts/
# ============================================================================
# GATE 2: Testing Gates
# GATE 2: Testing Gates (runs after container images are published to GHCR)
# ============================================================================
gate-2-start:
name: "Gate 2: Testing - Starting"
runs-on: ubuntu-latest
needs: gate-1-complete
needs: [gate-1-complete, container-build-apps]
steps:
- name: Gate 2 checkpoint
run: |
@@ -759,6 +759,13 @@ jobs:
- name: Checkout code
uses: actions/checkout@v6
- name: Log in to GitHub Container Registry
uses: docker/login-action@v4
with:
registry: ${{ env.REGISTRY }}
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Setup npm with Nexus
uses: ./.github/actions/setup-npm
with:
@@ -777,7 +784,7 @@ jobs:
else
echo "::warning::No playwright.config.ts found — E2E tests not configured"
fi
timeout-minutes: 10
timeout-minutes: 15
- name: Upload test results
if: always()
@@ -1297,7 +1304,7 @@ jobs:
});
# ============================================================================
# GATE 7: Container Build & Push (push/tag/dispatch only, not PRs)
# GATE 7: Container Build & Push to GHCR (after Gate 1, before testing)
# ════════════════════════════════════════════════════════════════════════════
# Tiered base images respecting the dependency DAG:
# Tier 1 (independent): base-apt, base-node-deps, base-pip-deps
@@ -1311,8 +1318,8 @@ jobs:
container-base-tier1:
name: "Gate 7 T1: ${{ matrix.image }}"
runs-on: ubuntu-latest
needs: gate-3-complete
if: github.event_name != 'pull_request' && github.event_name != 'issues' && github.event_name != 'issue_comment'
needs: gate-1-complete
if: github.event_name != 'issues' && github.event_name != 'issue_comment'
strategy:
fail-fast: false
matrix:
@@ -1335,6 +1342,56 @@ jobs:
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v4
with:
# host networking lets BuildKit reach Verdaccio on localhost:4873
driver-opts: network=host
- name: Start Verdaccio and publish patched packages
if: matrix.image == 'base-node-deps'
shell: bash
run: |
npm install -g verdaccio@6 --silent
mkdir -p /tmp/verdaccio-storage
cat > /tmp/verdaccio-ci.yaml << 'VERDACCIO_EOF'
storage: /tmp/verdaccio-storage
uplinks:
npmjs:
url: https://registry.npmjs.org/
timeout: 60s
max_fails: 3
packages:
'@esbuild-kit/*':
access: $all
publish: $all
proxy: npmjs
'**':
access: $all
publish: $all
proxy: npmjs
server:
keepAliveTimeout: 60
log:
type: stdout
format: pretty
level: warn
listen: 0.0.0.0:4873
VERDACCIO_EOF
verdaccio --config /tmp/verdaccio-ci.yaml &
timeout 30 bash -c 'until curl -sf http://localhost:4873/-/ping >/dev/null 2>&1; do sleep 1; done'
echo "Verdaccio ready"
# Publish patched tarballs
for tarball in deployment/npm-patches/*.tgz; do
[ -f "$tarball" ] || continue
echo "Publishing $tarball..."
npm publish "$tarball" \
--registry http://localhost:4873 \
--tag patched \
2>&1 | grep -v "^npm notice" || true
done
echo "Patched packages published to Verdaccio"
- name: Log in to GitHub Container Registry
uses: docker/login-action@v4
@@ -1363,7 +1420,7 @@ jobs:
context: .
file: ${{ matrix.dockerfile }}
platforms: ${{ matrix.platforms }}
push: ${{ github.event_name != 'pull_request' }}
push: true
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
cache-from: type=gha,scope=${{ matrix.image }}
@@ -1431,7 +1488,7 @@ jobs:
context: .
file: ${{ matrix.dockerfile }}
platforms: ${{ matrix.platforms }}
push: ${{ github.event_name != 'pull_request' }}
push: true
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
cache-from: type=gha,scope=${{ matrix.image }}
@@ -1490,7 +1547,7 @@ jobs:
context: .
file: ./deployment/base-images/Dockerfile.devcontainer
platforms: linux/amd64,linux/arm64
push: ${{ github.event_name != 'pull_request' }}
push: true
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
cache-from: type=gha,scope=devcontainer
@@ -1512,7 +1569,7 @@ jobs:
name: "Gate 7 App: ${{ matrix.image }}"
runs-on: ubuntu-latest
needs: [container-base-tier1]
if: github.event_name != 'pull_request' && github.event_name != 'issues' && github.event_name != 'issue_comment' && !failure()
if: github.event_name != 'issues' && github.event_name != 'issue_comment' && !failure()
strategy:
fail-fast: false
matrix:
@@ -1538,6 +1595,12 @@ jobs:
- image: exploded-diagrams
context: .
dockerfile: ./frontends/exploded-diagrams/Dockerfile
- image: dbal
context: ./dbal
dockerfile: ./dbal/production/build-config/Dockerfile
- image: dbal-init
context: .
dockerfile: ./deployment/config/dbal/Dockerfile.init
steps:
- name: Checkout repository
uses: actions/checkout@v6
@@ -1571,7 +1634,7 @@ jobs:
with:
context: ${{ matrix.context }}
file: ${{ matrix.dockerfile }}
push: ${{ github.event_name != 'pull_request' }}
push: true
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
cache-from: type=gha,scope=${{ matrix.image }}
@@ -1613,6 +1676,8 @@ jobs:
- postgres-dashboard
- workflowui
- exploded-diagrams
- dbal
- dbal-init
steps:
- name: Log in to GitHub Container Registry
uses: docker/login-action@v4
@@ -1776,7 +1841,13 @@ jobs:
summary += ' 1.1 DBAL Schemas 1.2 TypeScript 1.3 Lint\n';
summary += ' 1.4 Security 1.5 File Size 1.6 Complexity 1.7 Stubs\n';
summary += ' |\n';
summary += 'Gate 2: Testing (3 steps)\n';
summary += 'Gate 7: Containers (after Gate 1)\n';
summary += ' T1: base-apt, node-deps, pip-deps\n';
summary += ' T2: conan-deps, android-sdk\n';
summary += ' T3: devcontainer\n';
summary += ' Apps: 9 images (incl. dbal, dbal-init) -> GHCR\n';
summary += ' |\n';
summary += 'Gate 2: Testing (3 steps, pulls prod images)\n';
summary += ' 2.1 Unit Tests (+ coverage) 2.2 E2E 2.3 DBAL\n';
summary += ' |\n';
summary += 'Gate 3: Build (2 steps)\n';
@@ -1787,12 +1858,6 @@ jobs:
summary += 'Gate 5: Staging (main push)\n';
summary += ' |\n';
summary += 'Gate 6: Production (release/manual)\n';
summary += ' |\n';
summary += 'Gate 7: Containers (push/tag/dispatch)\n';
summary += ' T1: base-apt, node-deps, pip-deps\n';
summary += ' T2: conan-deps, android-sdk\n';
summary += ' T3: devcontainer\n';
summary += ' Apps: 7 images -> Trivy scan -> Multi-arch manifests\n';
summary += '```\n\n';
console.log(summary);

View File

@@ -5,7 +5,8 @@
ARG BUILD_TYPE=Release
# ── Build stage ──────────────────────────────────────────────────────────────
FROM metabuilder/base-apt:latest AS builder
ARG BASE_REGISTRY=metabuilder
FROM ${BASE_REGISTRY}/base-apt:latest AS builder
ARG BUILD_TYPE
@@ -56,7 +57,8 @@ RUN cd /dbal/build \
&& strip dbal_daemon
# ── Runtime stage ────────────────────────────────────────────────────────────
FROM metabuilder/base-apt:latest
ARG BASE_REGISTRY=metabuilder
FROM ${BASE_REGISTRY}/base-apt:latest
WORKDIR /app

View File

@@ -7,7 +7,7 @@
# App Dockerfiles:
# COPY --from=metabuilder/base-node-deps /app/node_modules ./node_modules
FROM node:20-alpine
FROM node:20-slim
WORKDIR /app
@@ -51,6 +51,9 @@ COPY translations/package.json ./translations/
COPY types/package.json ./types/
COPY workflow/package.json ./workflow/
# Postinstall patch script (patches vulnerable bundled deps in npm)
COPY scripts/patch-bundled-deps.sh ./scripts/
# Install all workspace deps (generates lock file from package.json manifests)
RUN npm config set fetch-retries 5 \
&& npm config set fetch-retry-mintimeout 20000 \

View File

@@ -27,16 +27,13 @@ server {
proxy_read_timeout 120s;
}
# ── DBAL API stubs ────────────────────────────────────────────────────
# ── DBAL API — proxied to real C++ daemon ─────────────────────────────
location = /api/health {
add_header Content-Type application/json;
return 200 '{"status":"ok"}';
}
location = /api/version {
add_header Content-Type application/json;
return 200 '{"version":"smoke-stub"}';
location /api/ {
proxy_pass http://dbal:8080/;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_read_timeout 30s;
}
# ── Portal — must contain "MetaBuilder" ───────────────────────────────

View File

@@ -1,12 +1,8 @@
# docker-compose.smoke.yml — Lightweight smoke test stack for CI.
# docker-compose.smoke.yml — Smoke test stack for CI.
#
# Starts real admin-tool containers (phpMyAdmin, Mongo Express, RedisInsight)
# and a stub nginx gateway that returns 200 for all app paths.
# Uses only stock Docker Hub images — no custom builds required.
#
# The stub gateway lets path-routing smoke tests pass in CI without needing
# the full built stack. End-to-end deployment correctness is tested in
# staging/production against the real images.
# Includes a real DBAL daemon backed by PostgreSQL so E2E tests can seed
# and query data. Admin tools (phpMyAdmin, Mongo Express, RedisInsight)
# and an nginx gateway round out the stack.
#
# Usage:
# docker compose -f deployment/docker-compose.smoke.yml up -d --wait
@@ -27,6 +23,9 @@ services:
# On Linux (GitHub Actions) this requires the host-gateway extra_host.
extra_hosts:
- "host.docker.internal:host-gateway"
depends_on:
dbal:
condition: service_healthy
healthcheck:
test: ["CMD", "wget", "--quiet", "--tries=1", "--spider", "http://127.0.0.1/"]
interval: 5s
@@ -35,6 +34,75 @@ services:
networks:
- smoke
# ── DBAL + PostgreSQL ────────────────────────────────────────────────────
postgres:
image: postgres:15-alpine
environment:
POSTGRES_USER: metabuilder
POSTGRES_PASSWORD: metabuilder
POSTGRES_DB: metabuilder
tmpfs:
- /var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U metabuilder"]
interval: 5s
timeout: 3s
retries: 10
networks:
- smoke
dbal-init:
image: ${DBAL_INIT_IMAGE:-ghcr.io/johndoe6345789/metabuilder/dbal-init:latest}
build:
context: ..
dockerfile: deployment/config/dbal/Dockerfile.init
volumes:
- dbal-schemas:/target/schemas/entities
- dbal-templates:/target/templates/sql
networks:
- smoke
dbal:
image: ${DBAL_IMAGE:-ghcr.io/johndoe6345789/metabuilder/dbal:latest}
build:
context: ../dbal
dockerfile: production/build-config/Dockerfile
ports:
- "8080:8080"
environment:
DBAL_ADAPTER: postgres
DATABASE_URL: "postgresql://metabuilder:metabuilder@postgres:5432/metabuilder"
DBAL_SCHEMA_DIR: /app/schemas/entities
DBAL_TEMPLATE_DIR: /app/templates/sql
DBAL_SEED_DIR: /app/seeds/database
DBAL_SEED_ON_STARTUP: "true"
DBAL_BIND_ADDRESS: 0.0.0.0
DBAL_PORT: 8080
DBAL_MODE: production
DBAL_DAEMON: "true"
DBAL_LOG_LEVEL: info
DBAL_AUTO_CREATE_TABLES: "true"
DBAL_ENABLE_HEALTH_CHECK: "true"
DBAL_ADMIN_TOKEN: "smoke-test-admin-token"
DBAL_CORS_ORIGIN: "*"
JWT_SECRET_KEY: "test-secret"
volumes:
- dbal-schemas:/app/schemas/entities:ro
- dbal-templates:/app/templates/sql:ro
depends_on:
dbal-init:
condition: service_completed_successfully
postgres:
condition: service_healthy
healthcheck:
test: ["CMD", "curl", "-f", "http://127.0.0.1:8080/health"]
interval: 5s
timeout: 3s
retries: 15
start_period: 10s
networks:
- smoke
# ── Infrastructure (stock images) ─────────────────────────────────────────
mysql:
image: mysql:8.0
@@ -142,6 +210,12 @@ services:
networks:
- smoke
volumes:
dbal-schemas:
driver: local
dbal-templates:
driver: local
networks:
smoke:
driver: bridge

View File

@@ -36,6 +36,29 @@
# http://localhost/kibana/ Kibana (Elasticsearch admin)
services:
# ============================================================================
# NPM Registry (Verdaccio) — serves patched @esbuild-kit packages
# ============================================================================
verdaccio:
image: verdaccio/verdaccio:6
container_name: metabuilder-verdaccio
restart: unless-stopped
ports:
- "4873:4873"
volumes:
- ./verdaccio.yaml:/verdaccio/conf/config.yaml:ro
- ./npm-patches:/verdaccio/patches:ro
- verdaccio-storage:/verdaccio/storage
healthcheck:
test: ["CMD", "wget", "--quiet", "--tries=1", "--spider", "http://127.0.0.1:4873/-/ping"]
interval: 15s
timeout: 5s
retries: 3
start_period: 5s
networks:
- metabuilder
# ============================================================================
# Core Services
# ============================================================================
@@ -1062,6 +1085,9 @@ services:
# Volumes
# ============================================================================
volumes:
# NPM registry
verdaccio-storage:
driver: local
# Core
postgres-data:
driver: local

View File

@@ -3,11 +3,15 @@
# else to npmjs.org.
#
# Usage:
# npx verdaccio --config deployment/verdaccio.yaml &
# bash deployment/publish-npm-patches.sh --verdaccio
# # .npmrc already points @esbuild-kit:registry to localhost:4873
# Local dev: npx verdaccio --config deployment/verdaccio.yaml &
# Compose: docker compose -f docker-compose.stack.yml up verdaccio
# CI: uses inline config with /tmp/verdaccio-storage
# Then: bash deployment/publish-npm-patches.sh --verdaccio
# .npmrc already points @esbuild-kit:registry to localhost:4873
storage: /tmp/verdaccio-storage
# Docker container path (volume-mounted in docker-compose.stack.yml).
# For local dev, use the CI composite action or npx verdaccio (default config).
storage: /verdaccio/storage
uplinks:
npmjs:
url: https://registry.npmjs.org/

View File

@@ -6,7 +6,10 @@
* 2. Seeds the database via the /api/setup endpoint
*/
import { DockerComposeEnvironment, Wait } from 'testcontainers'
import { resolve } from 'path'
import { resolve, dirname } from 'path'
import { fileURLToPath } from 'url'
const __dirname = dirname(fileURLToPath(import.meta.url))
let environment: Awaited<ReturnType<DockerComposeEnvironment['up']>> | undefined
@@ -32,19 +35,21 @@ async function globalSetup() {
await new Promise(resolve => setTimeout(resolve, 2000))
// ── 3. Seed database ────────────────────────────────────────────────────
// workflowui uses basePath: '/workflowui', so the setup route is at /workflowui/api/setup
const setupUrl = process.env.PLAYWRIGHT_BASE_URL
? new URL('/api/setup', process.env.PLAYWRIGHT_BASE_URL.replace(/\/workflowui\/?$/, '')).href
: 'http://localhost:3000/api/setup'
? new URL('/workflowui/api/setup', process.env.PLAYWRIGHT_BASE_URL.replace(/\/workflowui\/?$/, '')).href
: 'http://localhost:3000/workflowui/api/setup'
try {
const response = await fetch(setupUrl, { method: 'POST' })
if (!response.ok) {
console.error('[setup] Failed to seed database:', response.status, response.statusText)
} else {
console.log('[setup] Database seeded successfully')
throw new Error(`Seed endpoint returned ${response.status} ${response.statusText}`)
}
console.log('[setup] Database seeded successfully')
} catch (error) {
console.warn('[setup] Setup endpoint not available (non-fatal):', (error as Error).message)
const message = error instanceof Error ? error.message : String(error)
console.error('[setup] Failed to seed database:', message)
throw new Error(`[setup] Seeding failed — aborting test suite. ${message}`)
}
}

View File

@@ -90,7 +90,7 @@
"@commitlint/cli": "^20.4.2",
"@commitlint/config-conventional": "^20.4.2",
"@commitlint/prompt-cli": "^20.4.2",
"@electric-sql/pglite-socket": "^0.0.21",
"@electric-sql/pglite-socket": "^0.0.22",
"@eslint-react/eslint-plugin": "^2.13.0",
"@faker-js/faker": "^10.3.0",
"@lingual/i18n-check": "^0.8.19",

View File

@@ -159,7 +159,7 @@
{
"description": "Verify navigated to login page",
"action": "expect",
"selector": "text=Sign in to your account",
"selector": "[data-testid='salesforce-login-page']",
"assertion": {
"matcher": "toBeVisible"
}

View File

@@ -128,7 +128,7 @@
},
{
"action": "click",
"selector": "[data-testid='template-card-1'] >> text=View Template"
"selector": "[data-testid='template-card-email-automation'] >> text=View Template"
},
{
"action": "waitForLoadState",
@@ -158,7 +158,7 @@
"steps": [
{
"action": "navigate",
"url": "templates/1"
"url": "templates/email-automation"
},
{
"action": "waitForLoadState",

View File

@@ -0,0 +1,104 @@
/**
* POST /api/setup
*
* Seeds the database via the C++ DBAL REST API with system packages
* and page configurations. Called by E2E global setup to bootstrap
* test data before the test suite runs.
*/
import { NextResponse } from 'next/server'
const DBAL_URL =
process.env.DBAL_DAEMON_URL ??
process.env.DBAL_ENDPOINT ??
process.env.NEXT_PUBLIC_API_URL ??
'http://localhost:8080'
const ENTITY_BASE = `${DBAL_URL}/system/core`
const SEED_PACKAGES = [
{ packageId: 'package_manager', version: '1.0.0', enabled: true, config: '{"autoUpdate":false,"systemPackage":true,"uninstallProtection":true}' },
{ packageId: 'ui_header', version: '1.0.0', enabled: true, config: '{"systemPackage":true}' },
{ packageId: 'ui_footer', version: '1.0.0', enabled: true, config: '{"systemPackage":true}' },
{ packageId: 'ui_home', version: '1.0.0', enabled: true, config: '{"systemPackage":true,"defaultRoute":"/","publicAccess":true}' },
{ packageId: 'ui_auth', version: '1.0.0', enabled: true, config: '{"systemPackage":true}' },
{ packageId: 'ui_login', version: '1.0.0', enabled: true, config: '{"systemPackage":true}' },
{ packageId: 'dashboard', version: '1.0.0', enabled: true, config: '{"systemPackage":true,"defaultRoute":"/"}' },
{ packageId: 'user_manager', version: '1.0.0', enabled: true, config: '{"systemPackage":true,"minLevel":4}' },
{ packageId: 'role_editor', version: '1.0.0', enabled: true, config: '{"systemPackage":false,"minLevel":4}' },
{ packageId: 'admin_dialog', version: '1.0.0', enabled: true, config: '{"systemPackage":false,"minLevel":4}' },
{ packageId: 'database_manager', version: '1.0.0', enabled: true, config: '{"systemPackage":false,"minLevel":5,"dangerousOperations":true}' },
{ packageId: 'schema_editor', version: '1.0.0', enabled: true, config: '{"systemPackage":false,"minLevel":5,"dangerousOperations":true}' },
]
const SEED_PAGE_CONFIGS = [
{ path: '/', title: 'MetaBuilder', description: 'Data-driven application platform', packageId: 'ui_home', component: 'home_page', componentTree: '{}', level: 0, requiresAuth: false, isPublished: true, sortOrder: 0 },
{ path: '/dashboard', title: 'Dashboard', packageId: 'dashboard', component: 'dashboard_home', componentTree: '{}', level: 1, requiresAuth: true, isPublished: true, sortOrder: 0 },
{ path: '/profile', title: 'User Profile', packageId: 'dashboard', component: 'user_profile', componentTree: '{}', level: 1, requiresAuth: true, isPublished: true, sortOrder: 50 },
{ path: '/login', title: 'Login', packageId: 'ui_login', component: 'login_page', componentTree: '{}', level: 0, requiresAuth: false, isPublished: true, sortOrder: 0 },
{ path: '/admin', title: 'Administration', packageId: 'admin_dialog', component: 'admin_panel', componentTree: '{}', level: 4, requiresAuth: true, isPublished: true, sortOrder: 0 },
{ path: '/admin/users', title: 'User Management', packageId: 'user_manager', component: 'user_list', componentTree: '{}', level: 4, requiresAuth: true, isPublished: true, sortOrder: 10 },
{ path: '/admin/roles', title: 'Role Editor', packageId: 'role_editor', component: 'role_editor', componentTree: '{}', level: 4, requiresAuth: true, isPublished: true, sortOrder: 20 },
{ path: '/admin/database', title: 'Database Manager', packageId: 'database_manager', component: 'database_manager', componentTree: '{}', level: 5, requiresAuth: true, isPublished: true, sortOrder: 30 },
{ path: '/admin/schema', title: 'Schema Editor', packageId: 'schema_editor', component: 'schema_editor', componentTree: '{}', level: 5, requiresAuth: true, isPublished: true, sortOrder: 40 },
{ path: '/admin/packages', title: 'Package Manager', packageId: 'package_manager', component: 'package_list', componentTree: '{}', level: 4, requiresAuth: true, isPublished: true, sortOrder: 50 },
]
async function dbalPost(entity: string, data: Record<string, unknown>): Promise<{ ok: boolean; status: number }> {
const res = await fetch(`${ENTITY_BASE}/${entity}`, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(data),
})
return { ok: res.ok, status: res.status }
}
export async function POST() {
const results = { packages: 0, pages: 0, skipped: 0, errors: 0 }
for (const pkg of SEED_PACKAGES) {
try {
const res = await dbalPost('InstalledPackage', {
...pkg,
installedAt: Math.floor(Date.now() / 1000),
})
if (res.ok) {
results.packages++
} else if (res.status === 409) {
results.skipped++
} else {
results.errors++
console.warn(`[Seed] Failed to seed package ${pkg.packageId}: HTTP ${res.status}`)
}
} catch (error) {
results.errors++
console.warn(`[Seed] Failed to seed package ${pkg.packageId}:`, error instanceof Error ? error.message : error)
}
}
for (const page of SEED_PAGE_CONFIGS) {
try {
const res = await dbalPost('PageConfig', page)
if (res.ok) {
results.pages++
} else if (res.status === 409) {
results.skipped++
} else {
results.errors++
console.warn(`[Seed] Failed to seed page ${page.path}: HTTP ${res.status}`)
}
} catch (error) {
results.errors++
console.warn(`[Seed] Failed to seed page ${page.path}:`, error instanceof Error ? error.message : error)
}
}
console.warn(`[Seed] Complete: ${results.packages} packages, ${results.pages} pages, ${results.skipped} skipped, ${results.errors} errors`)
const status = results.errors > 0 ? 207 : 200
return NextResponse.json({
success: results.errors === 0,
message: results.errors > 0 ? `Seeded with ${results.errors} error(s)` : 'Database seeded successfully',
results,
}, { status })
}

View File

@@ -1 +1,433 @@
{"templates": [], "categories": []}
{
"version": "1.0.0",
"templates": [
{
"id": "email-automation",
"name": "Email Automation",
"description": "Automated email workflows with scheduling and personalization",
"longDescription": "Complete email automation solution with support for scheduled sends, personalization tokens, A/B testing, and detailed analytics. Perfect for marketing campaigns, newsletters, and transactional emails.",
"category": "automation",
"icon": "📧",
"color": "#FF6B6B",
"difficulty": "beginner",
"tags": ["email", "marketing", "automation", "scheduling"],
"workflows": [
{
"name": "Send Welcome Email",
"description": "Welcome email on user registration",
"nodes": [
{ "id": "trigger", "type": "event", "label": "User Registered" },
{ "id": "template", "type": "operation", "label": "Load Email Template" },
{ "id": "personalize", "type": "operation", "label": "Personalize Content" },
{ "id": "send", "type": "action", "label": "Send Email" },
{ "id": "log", "type": "operation", "label": "Log Activity" }
],
"connections": [
{ "source": "trigger", "target": "template" },
{ "source": "template", "target": "personalize" },
{ "source": "personalize", "target": "send" },
{ "source": "send", "target": "log" }
]
},
{
"name": "Scheduled Newsletter",
"description": "Weekly newsletter distribution",
"nodes": [
{ "id": "schedule", "type": "trigger", "label": "Weekly Timer" },
{ "id": "fetch", "type": "operation", "label": "Fetch Content" },
{ "id": "format", "type": "operation", "label": "Format Newsletter" },
{ "id": "send", "type": "action", "label": "Send to List" },
{ "id": "track", "type": "operation", "label": "Track Metrics" }
],
"connections": [
{ "source": "schedule", "target": "fetch" },
{ "source": "fetch", "target": "format" },
{ "source": "format", "target": "send" },
{ "source": "send", "target": "track" }
]
}
],
"metadata": {
"author": "MetaBuilder Team",
"version": "1.0",
"createdAt": 1674086400000,
"updatedAt": 1674086400000,
"downloads": 1240,
"rating": 4.8,
"featured": true
},
"preview": {
"description": "Set up complete email automation in minutes"
}
},
{
"id": "data-pipeline",
"name": "Data Pipeline",
"description": "ETL pipeline for data extraction, transformation, and loading",
"longDescription": "Production-grade data pipeline with support for multiple data sources, complex transformations, error handling, and retry logic. Includes monitoring, logging, and data quality checks.",
"category": "data-processing",
"icon": "📊",
"color": "#4ECDC4",
"difficulty": "advanced",
"tags": ["data", "etl", "transformation", "pipeline", "integration"],
"workflows": [
{
"name": "CSV to Database",
"description": "Import CSV data to database with validation",
"nodes": [
{ "id": "fetch", "type": "operation", "label": "Fetch CSV" },
{ "id": "parse", "type": "operation", "label": "Parse Data" },
{ "id": "validate", "type": "operation", "label": "Validate Schema" },
{ "id": "transform", "type": "operation", "label": "Transform Fields" },
{ "id": "load", "type": "action", "label": "Load to DB" },
{ "id": "report", "type": "operation", "label": "Generate Report" }
],
"connections": [
{ "source": "fetch", "target": "parse" },
{ "source": "parse", "target": "validate" },
{ "source": "validate", "target": "transform" },
{ "source": "transform", "target": "load" },
{ "source": "load", "target": "report" }
]
}
],
"metadata": {
"author": "MetaBuilder Team",
"version": "1.0",
"createdAt": 1674086400000,
"updatedAt": 1674086400000,
"downloads": 856,
"rating": 4.7,
"featured": true
},
"preview": {
"description": "Transform raw data into insights automatically"
}
},
{
"id": "slack-notification",
"name": "Slack Notifications",
"description": "Real-time Slack notifications for events and alerts",
"longDescription": "Send rich, formatted Slack notifications with attachments, interactive buttons, and threading support. Monitor events, alerts, and automated actions with real-time Slack updates.",
"category": "communication",
"icon": "💬",
"color": "#36C5F0",
"difficulty": "beginner",
"tags": ["slack", "notification", "communication", "alerts"],
"workflows": [
{
"name": "Alert to Slack",
"description": "Send alerts to Slack channel",
"nodes": [
{ "id": "trigger", "type": "event", "label": "Alert Event" },
{ "id": "format", "type": "operation", "label": "Format Message" },
{ "id": "send", "type": "action", "label": "Send to Slack" },
{ "id": "log", "type": "operation", "label": "Log Notification" }
],
"connections": [
{ "source": "trigger", "target": "format" },
{ "source": "format", "target": "send" },
{ "source": "send", "target": "log" }
]
}
],
"metadata": {
"author": "MetaBuilder Team",
"version": "1.0",
"createdAt": 1674086400000,
"updatedAt": 1674086400000,
"downloads": 2100,
"rating": 4.9,
"featured": true
},
"preview": {
"description": "Keep your team instantly informed with Slack"
}
},
{
"id": "api-monitoring",
"name": "API Monitoring",
"description": "Monitor API endpoints with health checks and alerting",
"longDescription": "Comprehensive API monitoring solution with periodic health checks, response time tracking, error rate monitoring, and multi-channel alerting for critical issues.",
"category": "monitoring",
"icon": "🔍",
"color": "#95E1D3",
"difficulty": "intermediate",
"tags": ["monitoring", "api", "health-check", "alerts"],
"workflows": [
{
"name": "API Health Check",
"description": "Monitor API endpoint health every 5 minutes",
"nodes": [
{ "id": "timer", "type": "trigger", "label": "5-min Timer" },
{ "id": "request", "type": "operation", "label": "Ping API" },
{ "id": "check", "type": "operation", "label": "Check Status" },
{ "id": "alert", "type": "action", "label": "Alert if Down" },
{ "id": "metrics", "type": "operation", "label": "Update Metrics" }
],
"connections": [
{ "source": "timer", "target": "request" },
{ "source": "request", "target": "check" },
{ "source": "check", "target": "alert" },
{ "source": "check", "target": "metrics" }
]
}
],
"metadata": {
"author": "MetaBuilder Team",
"version": "1.0",
"createdAt": 1674086400000,
"updatedAt": 1674086400000,
"downloads": 634,
"rating": 4.6,
"featured": false
},
"preview": {
"description": "Ensure maximum uptime with continuous monitoring"
}
},
{
"id": "cms-content-sync",
"name": "CMS Content Sync",
"description": "Synchronize content between CMS and external platforms",
"longDescription": "Bidirectional content synchronization between your CMS and various platforms. Support for drafts, publishing, versioning, and rollback capabilities.",
"category": "integration",
"icon": "📝",
"color": "#F38181",
"difficulty": "advanced",
"tags": ["cms", "content", "sync", "integration"],
"workflows": [
{
"name": "Publish Content",
"description": "Publish CMS content to web",
"nodes": [
{ "id": "trigger", "type": "event", "label": "Content Published" },
{ "id": "fetch", "type": "operation", "label": "Fetch Content" },
{ "id": "validate", "type": "operation", "label": "Validate" },
{ "id": "deploy", "type": "action", "label": "Deploy" }
],
"connections": [
{ "source": "trigger", "target": "fetch" },
{ "source": "fetch", "target": "validate" },
{ "source": "validate", "target": "deploy" }
]
}
],
"metadata": {
"author": "MetaBuilder Team",
"version": "1.0",
"createdAt": 1674086400000,
"updatedAt": 1674086400000,
"downloads": 342,
"rating": 4.5,
"featured": false
},
"preview": {
"description": "Keep all your content platforms synchronized"
}
},
{
"id": "lead-scoring",
"name": "Lead Scoring",
"description": "Automated lead scoring and qualification system",
"longDescription": "Intelligent lead scoring with configurable rules, engagement tracking, and automatic routing to sales teams. Includes scoring model management and performance analytics.",
"category": "crm",
"icon": "🎯",
"color": "#FFA07A",
"difficulty": "intermediate",
"tags": ["crm", "leads", "scoring", "automation"],
"workflows": [
{
"name": "Score New Lead",
"description": "Calculate lead score on signup",
"nodes": [
{ "id": "trigger", "type": "event", "label": "Lead Signup" },
{ "id": "fetch", "type": "operation", "label": "Fetch Lead Data" },
{ "id": "score", "type": "operation", "label": "Calculate Score" },
{ "id": "route", "type": "action", "label": "Route to Sales" },
{ "id": "notify", "type": "action", "label": "Notify Team" }
],
"connections": [
{ "source": "trigger", "target": "fetch" },
{ "source": "fetch", "target": "score" },
{ "source": "score", "target": "route" },
{ "source": "score", "target": "notify" }
]
}
],
"metadata": {
"author": "MetaBuilder Team",
"version": "1.0",
"createdAt": 1674086400000,
"updatedAt": 1674086400000,
"downloads": 567,
"rating": 4.7,
"featured": false
},
"preview": {
"description": "Prioritize high-value leads automatically"
}
},
{
"id": "inventory-management",
"name": "Inventory Management",
"description": "Real-time inventory tracking and stock alerts",
"longDescription": "Complete inventory management system with real-time stock tracking, low-stock alerts, automatic reorder suggestions, and supplier integration.",
"category": "ecommerce",
"icon": "📦",
"color": "#AA96DA",
"difficulty": "intermediate",
"tags": ["inventory", "ecommerce", "tracking", "alerts"],
"workflows": [
{
"name": "Stock Low Alert",
"description": "Alert when inventory is low",
"nodes": [
{ "id": "check", "type": "trigger", "label": "Check Stock" },
{ "id": "query", "type": "operation", "label": "Query Database" },
{ "id": "compare", "type": "operation", "label": "Check Threshold" },
{ "id": "alert", "type": "action", "label": "Send Alert" },
{ "id": "suggest", "type": "action", "label": "Suggest Reorder" }
],
"connections": [
{ "source": "check", "target": "query" },
{ "source": "query", "target": "compare" },
{ "source": "compare", "target": "alert" },
{ "source": "alert", "target": "suggest" }
]
}
],
"metadata": {
"author": "MetaBuilder Team",
"version": "1.0",
"createdAt": 1674086400000,
"updatedAt": 1674086400000,
"downloads": 423,
"rating": 4.6,
"featured": false
},
"preview": {
"description": "Never run out of stock with smart tracking"
}
},
{
"id": "report-generation",
"name": "Report Generation",
"description": "Automated report creation and distribution",
"longDescription": "Create professional reports automatically with customizable templates, scheduled generation, multi-format output (PDF, Excel, HTML), and email distribution.",
"category": "reporting",
"icon": "📄",
"color": "#FCBAD3",
"difficulty": "intermediate",
"tags": ["reporting", "automation", "distribution"],
"workflows": [
{
"name": "Weekly Report",
"description": "Generate and send weekly report",
"nodes": [
{ "id": "schedule", "type": "trigger", "label": "Weekly Timer" },
{ "id": "collect", "type": "operation", "label": "Collect Data" },
{ "id": "generate", "type": "operation", "label": "Generate PDF" },
{ "id": "send", "type": "action", "label": "Email Report" }
],
"connections": [
{ "source": "schedule", "target": "collect" },
{ "source": "collect", "target": "generate" },
{ "source": "generate", "target": "send" }
]
}
],
"metadata": {
"author": "MetaBuilder Team",
"version": "1.0",
"createdAt": 1674086400000,
"updatedAt": 1674086400000,
"downloads": 678,
"rating": 4.7,
"featured": false
},
"preview": {
"description": "Automated reporting on your schedule"
}
}
],
"categories": [
{
"id": "automation",
"name": "Automation",
"description": "Streamline repetitive tasks and workflows",
"icon": "⚙️",
"color": "#667BC6"
},
{
"id": "data-processing",
"name": "Data Processing",
"description": "ETL and data transformation pipelines",
"icon": "📊",
"color": "#4ECDC4"
},
{
"id": "integration",
"name": "Integration",
"description": "Connect and synchronize systems",
"icon": "🔗",
"color": "#45B7D1"
},
{
"id": "monitoring",
"name": "Monitoring",
"description": "Track health and performance",
"icon": "🔍",
"color": "#96CEB4"
},
{
"id": "reporting",
"name": "Reporting",
"description": "Generate and distribute reports",
"icon": "📈",
"color": "#FFEAA7"
},
{
"id": "communication",
"name": "Communication",
"description": "Send notifications and messages",
"icon": "💬",
"color": "#DDA0DD"
},
{
"id": "content",
"name": "Content",
"description": "Manage and publish content",
"icon": "📝",
"color": "#F38181"
},
{
"id": "ecommerce",
"name": "E-Commerce",
"description": "Online store and sales automation",
"icon": "🛍️",
"color": "#AA96DA"
},
{
"id": "finance",
"name": "Finance",
"description": "Financial tracking and analysis",
"icon": "💰",
"color": "#A8E6CF"
},
{
"id": "crm",
"name": "CRM",
"description": "Customer relationship management",
"icon": "🎯",
"color": "#FFA07A"
},
{
"id": "hr",
"name": "HR",
"description": "Human resources and team management",
"icon": "👥",
"color": "#FFD3B6"
}
]
}
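A consumer of this catalog will typically filter templates by category and by the `metadata.featured` flag. A small sketch of that lookup, using an inline `catalog` that abbreviates the real `templates.json` (field names match the file; the `featuredIn` helper is illustrative):

```typescript
// Filter the templates catalog by category and featured flag.
// `catalog` abbreviates the real templates.json; its shape matches the file.
type Catalog = {
  templates: Array<{ id: string; category: string; metadata: { featured: boolean } }>
}

const catalog: Catalog = {
  templates: [
    { id: 'email-automation', category: 'automation', metadata: { featured: true } },
    { id: 'api-monitoring', category: 'monitoring', metadata: { featured: false } },
    { id: 'slack-notification', category: 'communication', metadata: { featured: true } },
  ],
}

function featuredIn(cat: Catalog, category?: string): string[] {
  return cat.templates
    .filter(t => t.metadata.featured && (!category || t.category === category))
    .map(t => t.id)
}

console.log(featuredIn(catalog))               // all featured template ids
console.log(featuredIn(catalog, 'automation')) // featured ids in one category
```

Keeping `featured` inside `metadata` (as the file does) means the filter reads one nested field rather than duplicating a top-level flag per template.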