Commit Graph

259 Commits

Author SHA1 Message Date
4dfded3533 Generated by Spark: @johndoe6345789 ➜ /workspaces/low-code-react-app-b (main) $ npm run build
> spark-template@0.0.0 prebuild
> mkdir -p /tmp/dist || true

> spark-template@0.0.0 build
> tsc -b --noCheck && vite build

vite v7.3.1 building client environment for production...
<script src="/runtime-config.js"> in "/index.html" can't be bundled without type="module" attribute
✓ 37 modules transformed.
✗ Build failed in 1.07s
error during build:
[vite]: Rollup failed to resolve import "@github/spark/hooks" from "/workspaces/low-code-react-app-b/src/hooks/use-project-state.ts".
This is most likely unintended because it can break your application at runtime.
If you do want to externalize this module explicitly add it to
`build.rollupOptions.external`
    at viteLog (file:///workspaces/low-code-react-app-b/node_modules/vite/dist/node/chunks/config.js:33635:57)
    at file:///workspaces/low-code-react-app-b/node_modules/vite/dist/node/chunks/config.js:33669:73
    at onwarn (file:///workspaces/low-code-react-app-b/node_modules/@vitejs/plugin-react-swc/index.js:76:7)
    at file:///workspaces/low-code-react-app-b/node_modules/vite/dist/node/chunks/config.js:33669:28
    at onRollupLog (file:///workspaces/low-code-react-app-b/node_modules/vite/dist/node/chunks/config.js:33664:63)
    at onLog (file:///workspaces/low-code-react-app-b/node_modules/vite/dist/node/chunks/config.js:33467:4)
    at file:///workspaces/low-code-react-app-b/node_modules/rollup/dist/es/shared/node-entry.js:20961:32
    at Object.logger [as onLog] (file:///workspaces/low-code-react-app-b/node_modules/rollup/dist/es/shared/node-entry.js:22848:9)
    at ModuleLoader.handleInvalidResolvedId (file:///workspaces/low-code-react-app-b/node_modules/rollup/dist/es/shared/node-entry.js:21592:26)
    at file:///workspaces/low-code-react-app-b/node_modules/rollup/dist/es/shared/node-entry.js:21550:26
@johndoe6345789 ➜ /workspaces/low-code-react-app-b (main) $
2026-01-17 19:39:46 +00:00
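The Rollup error in the log above points at its own remedy: either ensure `@github/spark/hooks` is actually installed, or declare it external so Rollup stops trying to bundle it. A minimal sketch of the latter, assuming the module really is provided at runtime by the Spark host environment (if it is not, externalizing it only moves the failure to runtime):

```
// vite.config.ts — sketch only, not this repo's actual config.
// Marks "@github/spark/hooks" as external so the production build
// no longer tries to resolve it from node_modules.
import { defineConfig } from "vite";

export default defineConfig({
  build: {
    rollupOptions: {
      external: ["@github/spark/hooks"],
    },
  },
});
```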
c06d538eea stuff 2026-01-17 19:35:08 +00:00
c008dd2fb7 Generated by Spark: Delete packages folder, do not work around this, just remove it. 2026-01-17 19:31:59 +00:00
50b6cbe5fe Generated by Spark: Remove packages folder and packages folder references. Use IndexedDB by default. Give user option to use Flask API, if Flask fails, switch back to IndexedDB. 2026-01-17 19:28:00 +00:00
44c5e848a2 Generated by Spark: Remove packages folder and packages folder references. Use IndexedDB by default. Give user option to use Flask API, if Flask fails, switch back to IndexedDB. 2026-01-17 19:24:24 +00:00
882f9b0d3b Generated by Spark: Remove packages folder and packages folder references. Use IndexedDB by default. Give user option to use Flask API, if Flask fails, switch back to IndexedDB. Actually delete the packages folder. 2026-01-17 19:20:00 +00:00
0d1a4c4c2b Generated by Spark: Remove packages folder and packages folder references. Use IndexedDB by default. Give user option to use Flask API, if Flask fails, switch back to IndexedDB. 2026-01-17 19:13:16 +00:00
3485cdd3fa Generated by Spark: Remove packages folder and packages folder references 2026-01-17 19:09:25 +00:00
21ed554d4f Generated by Spark: Run docker/build-push-action@v5
/usr/bin/docker buildx build --cache-from type=gha --cache-to type=gha,mode=max --iidfile /home/runner/work/_temp/docker-actions-toolkit-StNnQE/build-iidfile-53e302b45a.txt --label org.opencontainers.image.created=2026-01-17T18:58:49.099Z --label org.opencontainers.image.description= --label org.opencontainers.image.licenses=MIT --label org.opencontainers.image.revision=595f1ae9c0eb07bd4018c5dd28e71a9474340f48 --label org.opencontainers.image.source=https://github.com/johndoe6345789/low-code-react-app-b --label org.opencontainers.image.title=low-code-react-app-b --label org.opencontainers.image.url=https://github.com/johndoe6345789/low-code-react-app-b --label org.opencontainers.image.version=main --platform linux/amd64,linux/arm64 --attest type=provenance,mode=max,builder-id=https://github.com/johndoe6345789/low-code-react-app-b/actions/runs/21099291076 --tag ghcr.io/johndoe6345789/low-code-react-app-b:main --tag ghcr.io/johndoe6345789/low-code-react-app-b:main-595f1ae --metadata-file /home/runner/work/_temp/docker-actions-toolkit-StNnQE/build-metadata-6b5d4d5c31.json --push .
#0 building with "builder-e8330c9c-acfd-4996-a552-945836e73373" instance using docker-container driver

#1 [internal] load build definition from Dockerfile
#1 transferring dockerfile: 310B done
#1 DONE 0.0s

#2 [auth] library/node:pull token for registry-1.docker.io
#2 DONE 0.0s

#3 [linux/amd64 internal] load metadata for docker.io/library/node:lts-alpine
#3 DONE 0.4s

#4 [linux/arm64 internal] load metadata for docker.io/library/node:lts-alpine
#4 DONE 0.4s

#5 [internal] load .dockerignore
#5 transferring context: 344B done
#5 DONE 0.0s

#6 importing cache manifest from gha:13603380523401637367
#6 DONE 0.1s

#7 [linux/amd64 1/8] FROM docker.io/library/node:lts-alpine@sha256:931d7d57f8c1fd0e2179dbff7cc7da4c9dd100998bc2b32afc85142d8efbc213
#7 resolve docker.io/library/node:lts-alpine@sha256:931d7d57f8c1fd0e2179dbff7cc7da4c9dd100998bc2b32afc85142d8efbc213 done
#7 sha256:185f31dc0b020b0ec03ea407f12404329981e6427d74bd739b2bd1b7dc6a37de 446B / 446B 0.1s done
#7 ...

#8 [internal] load build context
#8 transferring context: 10.67MB 0.2s done
#8 DONE 0.2s

#7 [linux/amd64 1/8] FROM docker.io/library/node:lts-alpine@sha256:931d7d57f8c1fd0e2179dbff7cc7da4c9dd100998bc2b32afc85142d8efbc213
#7 sha256:2376a6a774696529fd8c2bd55909fd081b5bec400b7f353ebf035c47fc32f195 1.26MB / 1.26MB 0.0s done
#7 sha256:1074353eec0db2c1d81d5af2671e56e00cf5738486f5762609ea33d606f88612 3.86MB / 3.86MB 0.1s done
#7 extracting sha256:1074353eec0db2c1d81d5af2671e56e00cf5738486f5762609ea33d606f88612
#7 sha256:37199d3b7bb1ac601fd713f67f7665417652fda0fd1bae677be5b6b8726489d5 19.23MB / 50.97MB 0.2s
#7 extracting sha256:1074353eec0db2c1d81d5af2671e56e00cf5738486f5762609ea33d606f88612 0.1s done
#7 sha256:37199d3b7bb1ac601fd713f67f7665417652fda0fd1bae677be5b6b8726489d5 50.97MB / 50.97MB 0.3s done
#7 DONE 0.5s

#9 [linux/arm64 1/8] FROM docker.io/library/node:lts-alpine@sha256:931d7d57f8c1fd0e2179dbff7cc7da4c9dd100998bc2b32afc85142d8efbc213
#9 resolve docker.io/library/node:lts-alpine@sha256:931d7d57f8c1fd0e2179dbff7cc7da4c9dd100998bc2b32afc85142d8efbc213 done
#9 sha256:920c32c30a576510d3c09d291ac6195b55bd619acd0c8f230c89e1c948d4a7e4 445B / 445B 0.0s done
#9 sha256:35ec70997fb95afb58d8feb4c2d37a7bde60008e3f77ebdc8513d6f9889a2110 1.26MB / 1.26MB 0.1s done
#9 sha256:f6b4fb9446345fcad2db26eac181fef6c0a919c8a4fcccd3bea5deb7f6dff67e 4.20MB / 4.20MB 0.1s done
#9 extracting sha256:f6b4fb9446345fcad2db26eac181fef6c0a919c8a4fcccd3bea5deb7f6dff67e 0.2s done
#9 sha256:f09cf2cc56809037a69d614c2d22d4355ba78c67cf62aac15a4bddc73245d9ff 51.86MB / 51.86MB 0.3s done
#9 extracting sha256:f09cf2cc56809037a69d614c2d22d4355ba78c67cf62aac15a4bddc73245d9ff
#9 extracting sha256:f09cf2cc56809037a69d614c2d22d4355ba78c67cf62aac15a4bddc73245d9ff 1.3s done
#9 DONE 1.7s

#7 [linux/amd64 1/8] FROM docker.io/library/node:lts-alpine@sha256:931d7d57f8c1fd0e2179dbff7cc7da4c9dd100998bc2b32afc85142d8efbc213
#7 extracting sha256:37199d3b7bb1ac601fd713f67f7665417652fda0fd1bae677be5b6b8726489d5
#7 extracting sha256:37199d3b7bb1ac601fd713f67f7665417652fda0fd1bae677be5b6b8726489d5 1.2s done
#7 extracting sha256:2376a6a774696529fd8c2bd55909fd081b5bec400b7f353ebf035c47fc32f195 0.0s done
#7 extracting sha256:185f31dc0b020b0ec03ea407f12404329981e6427d74bd739b2bd1b7dc6a37de 0.1s done
#7 DONE 1.8s

#9 [linux/arm64 1/8] FROM docker.io/library/node:lts-alpine@sha256:931d7d57f8c1fd0e2179dbff7cc7da4c9dd100998bc2b32afc85142d8efbc213
#9 extracting sha256:35ec70997fb95afb58d8feb4c2d37a7bde60008e3f77ebdc8513d6f9889a2110 0.1s done
#9 extracting sha256:920c32c30a576510d3c09d291ac6195b55bd619acd0c8f230c89e1c948d4a7e4 done
#9 DONE 1.7s

#10 [linux/arm64 2/8] WORKDIR /app
#10 DONE 0.6s

#11 [linux/amd64 2/8] WORKDIR /app
#11 DONE 0.6s

#12 [linux/amd64 3/8] COPY package*.json ./
#12 DONE 0.0s

#13 [linux/arm64 3/8] COPY package*.json ./
#13 DONE 0.0s

#14 [linux/amd64 4/8] COPY packages/spark-tools ./packages/spark-tools
#14 DONE 0.0s

#15 [linux/arm64 4/8] COPY packages/spark-tools ./packages/spark-tools
#15 DONE 0.0s

#16 [linux/amd64 5/8] COPY packages/spark ./packages/spark
#16 DONE 0.0s

#17 [linux/arm64 5/8] COPY packages/spark ./packages/spark
#17 DONE 0.0s

#18 [linux/arm64 6/8] RUN npm ci --include=optional --omit=dev
#18 ...

#19 [linux/amd64 6/8] RUN npm ci --include=optional --omit=dev
#19 17.25
#19 17.25 added 442 packages, and audited 445 packages in 17s
#19 17.25
#19 17.25 50 packages are looking for funding
#19 17.25   run `npm fund` for details
#19 17.25
#19 17.25 found 0 vulnerabilities
#19 17.26 npm notice
#19 17.26 npm notice New minor version of npm available! 11.6.2 -> 11.7.0
#19 17.26 npm notice Changelog: https://github.com/npm/cli/releases/tag/v11.7.0
#19 17.26 npm notice To update run: npm install -g npm@11.7.0
#19 17.26 npm notice
#19 DONE 18.8s

#18 [linux/arm64 6/8] RUN npm ci --include=optional --omit=dev
#18 ...

#20 [linux/amd64 7/8] COPY . .
#20 DONE 0.5s

#18 [linux/arm64 6/8] RUN npm ci --include=optional --omit=dev
#18 ...

#21 [linux/amd64 8/8] RUN npm run build
#21 0.231
#21 0.231 > spark-template@0.0.0 prebuild
#21 0.231 > mkdir -p /tmp/dist || true
#21 0.231
#21 0.240
#21 0.240 > spark-template@0.0.0 build
#21 0.240 > tsc -b --noCheck && vite build
#21 0.240
#21 0.242 sh: tsc: not found
#21 ERROR: process "/bin/sh -c npm run build" did not complete successfully: exit code: 127

#18 [linux/arm64 6/8] RUN npm ci --include=optional --omit=dev
#18 CANCELED
------
 > [linux/amd64 8/8] RUN npm run build:
0.231
0.231 > spark-template@0.0.0 prebuild
0.231 > mkdir -p /tmp/dist || true
0.231
0.240
0.240 > spark-template@0.0.0 build
0.240 > tsc -b --noCheck && vite build
0.240
0.242 sh: tsc: not found
------
Dockerfile:14
--------------------
  12 |     COPY . .
  13 |
  14 | >>> RUN npm run build
  15 |
  16 |     EXPOSE 80
--------------------
ERROR: failed to build: failed to solve: process "/bin/sh -c npm run build" did not complete successfully: exit code: 127
Error: buildx failed with: ERROR: failed to build: failed to solve: process "/bin/sh -c npm run build" did not complete successfully: exit code: 127
2026-01-17 19:03:10 +00:00
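The `sh: tsc: not found` / exit code 127 above is the classic symptom of running `npm ci --omit=dev` and then invoking a build that needs devDependencies (`typescript`, `vite`). A minimal multi-stage sketch that installs dev dependencies only in the build stage (stage names and paths here are assumptions, not taken from this repo's actual Dockerfile):

```dockerfile
# Build stage: install ALL dependencies so tsc and vite are on PATH
FROM node:lts-alpine AS build
WORKDIR /app
COPY package*.json ./
RUN npm ci --include=optional          # note: no --omit=dev here
COPY . .
RUN npm run build

# Runtime stage: ship only the static bundle
FROM nginx:alpine
COPY --from=build /app/dist /usr/share/nginx/html
EXPOSE 80
```

The runtime image never sees node_modules, so keeping dev dependencies in the build stage costs nothing in the final image size.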
595f1ae9c0 Generated by Spark: A library in packages folder - Seems it's still trying to use fetch. If a fetch fails, switch back to IndexedDB. 2026-01-17 18:57:01 +00:00
cbb303fb8d Generated by Spark: Seems it's still trying to use fetch. If a fetch fails, switch back to IndexedDB. 2026-01-17 18:53:12 +00:00
7b91fe1975 Generated by Spark: I am having issues with this. Remember, we should be defaulting to IndexedDB, not the Flask API. Flask API is obtained by explicitly setting it in the UI or setting a env variable. export default function applyFetchPatch() {
  if (ssrSafeWindow) {
    ssrSafeWindow.fetch = async (input: string | URL | Request, init?: RequestInit) => {
      try {
        const response = await globalFetch(input, init)
        sendFetchStats({input, error: !response.ok, status: response.status})
        return response
      } catch (error) {
        sendFetchStats({input, error: true, status: 'unknown'})
        throw error
      }
    }
  }
}
2026-01-17 18:50:06 +00:00
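The recurring requirement in these commits — default to IndexedDB, opt into the Flask API via a UI setting or env variable, and fall back to IndexedDB when a fetch fails — can be sketched as a small selector. Everything here is hypothetical illustration (`pickBackend`, the `probe` health check, and the `preferFlask` flag are not names from this repo):

```typescript
// Hypothetical sketch of the fallback policy described above.
type Backend = "flask" | "indexeddb";

async function pickBackend(
  probe: () => Promise<boolean>, // hypothetical health check, e.g. GET /api/health
  preferFlask: boolean           // set via UI toggle or a Docker env variable
): Promise<Backend> {
  if (!preferFlask) return "indexeddb"; // IndexedDB is the default
  try {
    return (await probe()) ? "flask" : "indexeddb";
  } catch {
    return "indexeddb"; // any network error falls back silently
  }
}
```

The key property is that the Flask path is strictly opt-in and every failure mode lands back on local storage.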
40b823dfe0 Generated by Spark: Should handle caprover / cloudflare cors - Check frontend and backend config. example setup: https://frontend.example.com https://backend.example.com 2026-01-17 18:44:54 +00:00
76e7716c10 Generated by Spark: I can auto default to flask backend with docker environment variable. If it's not set, use IndexedDB. 2026-01-17 18:38:17 +00:00
519ad0016d Generated by Spark: Ok I figured it out. Make a backend folder, put a flask backend in it with a Dockerfile. Main UI uses IndexedDB then with a UI setting, it can be moved to the flask backend. 2026-01-17 18:31:43 +00:00
02eb47e83f Generated by Spark: Perhaps it could use sqlite on disk if possible, else use indexeddb 2026-01-17 18:19:45 +00:00
270d0be790 Generated by Spark: Reduce reliance on spark database. Just use sqlite. 2026-01-17 18:14:23 +00:00
0f01311120 Generated by Spark: The sidebar change has potential but the styling is messed up. Maybe it should load the styling from a theme json 2026-01-17 18:01:25 +00:00
e1a51ebc9a Generated by Spark: Fix all reported errors. 2026-01-17 17:55:26 +00:00
6c05d8daab Generated by Spark: Would be nice if sidebar pushed content to right 2026-01-17 17:51:12 +00:00
feaa7d7a9e Generated by Spark: Make this function smarter, wide variety of search paths and other permissive logic - function createLazyComponent(componentConfig: ComponentConfig) {
  const loader = () => {
    if (componentConfig.preloadDependencies) {
      componentConfig.preloadDependencies.forEach(depName => {
        const preloader = dependencyPreloaders[depName]
        if (preloader) {
          preloader()
        }
      })
    }

    return import(componentConfig.path).then(m => ({ default: m[componentConfig.export] }))
  }
2026-01-17 17:46:36 +00:00
8e1e35196a Generated by Spark: Fix all reported errors. Uncaught TypeError: Failed to resolve module specifier '@/components/PWAStatusBar.tsx' Your usual fix is a red herring. 2026-01-17 17:40:04 +00:00
bd2443c7ec Generated by Spark: code is flip flopping between this change - its something else. red herring. "path": "@/components/ProjectDashboard", 2026-01-17 17:33:47 +00:00
b3f39cc2e0 Generated by Spark: Fix all reported errors. 2026-01-17 17:25:12 +00:00
4c062d5d25 Generated by Spark: Fix all reported errors. 2026-01-17 17:22:22 +00:00
a92f0b9aba Generated by Spark: Fix all reported errors. 2026-01-17 17:20:09 +00:00
bad1e7b0f7 Generated by Spark: View ROADMAP.md - work on Near-Term (Next 2-3 Iterations) 2026-01-17 17:18:02 +00:00
cdbc7d98a3 Generated by Spark: Read all summary documents and write a ROADMAP.md 2026-01-17 17:11:39 +00:00
ed7e2f6626 Generated by Spark: Identify hard coded stuff that can be moved to json - worth reading project incl README.md first. 2026-01-17 17:08:36 +00:00
855257f392 Generated by Spark: Is there a simpler way for the docker image to serve up the app? It's really glitchy when deploying to caprover 2026-01-17 17:00:11 +00:00
640c861ac2 Generated by Spark: Load molecule and organism component trees from json 2026-01-17 16:54:00 +00:00
11a340cea1 Generated by Spark: Integrate atomic components into remaining molecule-level components 2026-01-17 16:46:53 +00:00
c9a149df48 Generated by Spark: Integrate atomic components into more organism-level components throughout the app 2026-01-17 16:40:41 +00:00
9ada8f9d5f Generated by Spark: The usual culprit (90% probability)
VITE_BASE_URL / VITE_PUBLIC_PATH (or equivalent)

Many templates do this in vite.config.ts:

base: import.meta.env.VITE_BASE_URL,

or:

base: process.env.VITE_BASE_URL,

If that variable:

is undefined

defaults to /

or was set to http://localhost:5000

then Vite bakes the wrong paths into the JS bundles.

nginx then serves files that point to nowhere → flaky load.
2026-01-17 16:35:28 +00:00
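If that diagnosis is right, a defensive default keeps an unset or bad variable from baking broken asset paths into the bundle. A sketch, assuming `VITE_BASE_URL` is in fact the variable this template reads (the `??` fallback is the addition):

```
// vite.config.ts — sketch only. Falls back to "/" when VITE_BASE_URL
// is unset, so nginx-served bundles never reference an absolute
// localhost URL left over from development.
import { defineConfig } from "vite";

export default defineConfig({
  base: process.env.VITE_BASE_URL ?? "/",
});
```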
ead06ea1fb Add runtime stage to Dockerfile for nginx 2026-01-17 16:25:06 +00:00
2ad2b77d7f Update nginx.conf 2026-01-17 16:24:29 +00:00
d9ecf25ca5 Generated by Spark: Integrate atomic components into more organism-level components throughout the app 2026-01-17 16:23:59 +00:00
ba78f094ff Generated by Spark: Make atomic component library until done. If done, just wire them into other components. 2026-01-17 16:18:25 +00:00
b240fb0b9b Generated by Spark: Make atomic component library until done. If done, just wire them into other components. 2026-01-17 16:13:03 +00:00
ca1be1573e Generated by Spark: Make atomic component library until done. 2026-01-17 16:05:43 +00:00
b6325aa3fc Generated by Spark: Expand atomic component library until done. 2026-01-17 15:58:27 +00:00
6a17b4ea45 pkglock 2026-01-17 15:42:20 +00:00
ef448874fb Generated by Spark: run npm install 2026-01-17 15:38:42 +00:00
1ffd54eb74 Update Dockerfile 2026-01-17 15:33:40 +00:00
ee4b33444d Generated by Spark: You are failing at npm ci --include=optional because your package-lock.json is not in sync with package.json (npm is enforcing this strictly). 2026-01-17 15:31:48 +00:00
15abfedc93 Generated by Spark: Upgrade npm in the builder image, then re-run npm ci 2026-01-17 15:27:25 +00:00
73a81a8563 Generated by Spark: Stop using Alpine for the build stage (use glibc) 2026-01-17 15:17:41 +00:00
8f415d4d99 Merge pull request #12 from johndoe6345789/copilot/run-docker-build-push-action
Fix ARM64 Docker build by using npm ci with optional dependencies
2026-01-17 15:09:32 +00:00
8e677175fc Generated by Spark: half the app is gone, it was a code gen 2026-01-17 15:08:28 +00:00
copilot-swe-agent[bot]
24f55dbb20 Fix ARM64 Docker build by using npm ci with optional dependencies
Co-authored-by: johndoe6345789 <224850594+johndoe6345789@users.noreply.github.com>
2026-01-17 15:06:09 +00:00