Commit Graph

280 Commits

Author SHA1 Message Date
copilot-swe-agent[bot]
4a78002ac2 Initial plan 2026-01-17 21:45:50 +00:00
b6ca631c9b Merge pull request #14 from johndoe6345789/copilot/remove-duplicate-json-components
Remove duplicate components, prefer JSON versions
2026-01-17 21:45:23 +00:00
copilot-swe-agent[bot]
a8dea7cd42 Remove legacy App files and obsolete documentation
Co-authored-by: johndoe6345789 <224850594+johndoe6345789@users.noreply.github.com>
2026-01-17 21:36:46 +00:00
copilot-swe-agent[bot]
716e38a324 Add documentation for legacy App files
Co-authored-by: johndoe6345789 <224850594+johndoe6345789@users.noreply.github.com>
2026-01-17 21:34:18 +00:00
copilot-swe-agent[bot]
d4512a3e98 Remove duplicate components, prefer JSON versions
Co-authored-by: johndoe6345789 <224850594+johndoe6345789@users.noreply.github.com>
2026-01-17 21:31:38 +00:00
copilot-swe-agent[bot]
0de5594357 Initial plan 2026-01-17 21:24:47 +00:00
321e6c8d04 Generated by Spark: Remove duplicate components, prefer json version. 2026-01-17 21:23:16 +00:00
613450f8fb Generated by Spark: Remove duplicate components, prefer json version. 2026-01-17 21:18:55 +00:00
98f4b49edf Generated by Spark: Create Redux persistence middleware to sync state with database automatically 2026-01-17 21:14:31 +00:00
45454ac34b Generated by Spark: Add conflict resolution UI for handling sync conflicts between local and remote data 2026-01-17 21:08:10 +00:00
f58c7ac857 Generated by Spark: Implement Redux, Integrate redux into IndexedDB / Flask API system. 2026-01-17 21:01:23 +00:00
98b11d34e9 Generated by Spark: 1. Convert stuff to JSON Component Trees. 2. Make atomic components for said component trees. 2026-01-17 20:54:33 +00:00
0b9754e0f6 Remove obsolete documentation and scripts related to theme configuration, troubleshooting, and linting processes
- Deleted THEME_JSON_SYSTEM.md, TROUBLESHOOTING.md, and VERIFICATION_CHECKLIST.md as they are no longer relevant.
- Removed docker-entrypoint.sh, fix-node-modules.sh, procedural-lint-fix.sh, quick-lint-check.sh, run-lint-verification.sh, validate-tests.sh, and verify-lint.sh scripts that are outdated or redundant.
2026-01-17 20:43:12 +00:00
693001543b stuff 2026-01-17 20:41:48 +00:00
cd19aef42f Generated by Spark: Breadcrumb should probably have a clear / reset button 2026-01-17 20:36:01 +00:00
23465084e0 Generated by Spark: Expand All, Collapse All buttons are too big, Maybe just use icon buttons? Don't really need the text, tooltip would be nice. 2026-01-17 20:29:59 +00:00
c7ebafe6a8 Generated by Spark: IndexedDB and Flask replaces Spark. Fix this please. Run npx tsc --noEmit
Error: src/components/FeatureIdeaCloud.tsx(920,37): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/components/PlaywrightDesigner.tsx(105,37): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/components/StorybookDesigner.tsx(115,37): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/components/TemplateExplorer.tsx(40,31): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/components/TemplateExplorer.tsx(44,32): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/components/UnitTestDesigner.tsx(138,37): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/hooks/data/use-seed-data.ts(23,19): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/hooks/data/use-seed-data.ts(30,33): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/hooks/data/use-seed-data.ts(40,24): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/hooks/data/use-seed-data.ts(63,19): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/hooks/data/use-seed-data.ts(71,22): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/hooks/data/use-seed-data.ts(86,19): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/hooks/data/use-seed-data.ts(91,33): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/hooks/data/use-seed-data.ts(95,22): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/hooks/data/use-seed-templates.ts(87,22): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/hooks/data/use-seed-templates.ts(102,33): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/hooks/data/use-seed-templates.ts(104,22): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/hooks/data/use-seed-templates.ts(126,43): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/hooks/data/use-seed-templates.ts(129,24): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/hooks/data/use-seed-templates.ts(131,24): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/hooks/json-ui/use-data-sources.ts(25,40): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/hooks/json-ui/use-data-sources.ts(47,20): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/hooks/orchestration/use-actions.ts(119,31): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/hooks/use-component-tree-loader.ts(22,19): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/hooks/use-component-tree-loader.ts(27,42): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/hooks/use-component-tree-loader.ts(31,22): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/hooks/use-component-tree-loader.ts(43,24): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/hooks/use-component-tree-loader.ts(60,17): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/hooks/use-component-tree-loader.ts(65,32): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/hooks/use-component-tree-loader.ts(90,19): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/hooks/use-component-tree-loader.ts(94,20): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/lib/project-service.ts(21,39): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/lib/project-service.ts(61,18): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/lib/project-service.ts(63,38): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/lib/project-service.ts(66,20): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/lib/project-service.ts(74,36): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/lib/project-service.ts(83,18): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/lib/project-service.ts(85,38): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/lib/project-service.ts(87,18): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/lib/protected-llm-service.ts(39,31): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/lib/unified-storage.ts(199,17): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/lib/unified-storage.ts(200,25): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/lib/unified-storage.ts(204,17): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/lib/unified-storage.ts(205,18): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/lib/unified-storage.ts(209,17): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/lib/unified-storage.ts(210,18): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/lib/unified-storage.ts(214,17): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/lib/unified-storage.ts(215,25): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/lib/unified-storage.ts(219,17): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/lib/unified-storage.ts(220,34): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/lib/unified-storage.ts(221,49): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: src/lib/unified-storage.ts(409,18): error TS2339: Property 'spark' does not exist on type 'Window & typeof globalThis'.
Error: Process completed with exit code 2.
2026-01-17 20:19:44 +00:00
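The TS2339 wall above all traces to code still reading a `window.spark` global after the Spark runtime was removed. While migrating call sites, one way to quarantine such a global is a single typed accessor instead of scattered casts — a sketch only; `SparkGlobal` and `getSpark` are hypothetical names, and the real Spark object is richer than this:

```typescript
// Hypothetical shape for the runtime-injected global; the actual Spark
// API is larger. Adjust to whatever surface the migration still needs.
type SparkGlobal = { llm?: (prompt: string) => Promise<string> };

// Single typed access point: returns undefined when the global is absent
// (e.g. after Spark is replaced by IndexedDB/Flask), so callers can
// branch instead of triggering TS2339 at every use site.
function getSpark(): SparkGlobal | undefined {
  return (globalThis as { spark?: SparkGlobal }).spark;
}
```

Call sites then check `getSpark()` for undefined rather than touching `window.spark` directly, which also makes the eventual removal a grep for one function.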
c3029577de Merge pull request #13 from johndoe6345789/dependabot/pip/backend/pip-533d40512c
Bump the pip group across 1 directory with 2 updates
2026-01-17 20:14:06 +00:00
9b13431bd3 Generated by Spark: Expand All, Collapse All buttons are too big, Maybe just use icon buttons? 2026-01-17 20:09:04 +00:00
c14aeeec87 Generated by Spark: maybe we can fix these warnings: @johndoe6345789 ➜ /workspaces/spark-template (main) $ npm run dev
> spark-template@0.0.0 dev
> vite

7:54:09 PM [vite] (client) Re-optimizing dependencies because lockfile has changed
Port 5000 is in use, trying another one...

  VITE v7.3.1  ready in 381 ms

  ➜  Local:   http://localhost:5001/
  ➜  Network: http://10.0.0.185:5001/
  ➜  press h + enter to show help
7:54:16 PM [vite] (client) warning:
/workspaces/spark-template/src/lib/component-registry.ts
21 |          for (const path of pathVariants){
22 |              try {
23 |                  const module = await import(path);
   |                                              ^^^^
24 |                  const exportName = componentConfig.export || 'default';
25 |                  if (module[exportName]) {
The above dynamic import cannot be analyzed by Vite.
See https://github.com/rollup/plugins/tree/master/packages/dynamic-import-vars#limitations for supported dynamic import formats. If this is intended to be left as-is, you can use the /* @vite-ignore */ comment inside the import() call to suppress this warning.

  Plugin: vite:import-analysis
  File: /workspaces/spark-template/src/lib/component-registry.ts
7:54:16 PM [vite] (client) warning:
/workspaces/spark-template/src/lib/unified-storage.ts
201 |          const moduleName = 'sql.js';
202 |          try {
203 |              return await import(moduleName);
    |                                  ^^^^^^^^^^
204 |          } catch  {
205 |              throw new Error(`${moduleName} not installed. Run: npm install ${moduleName}`);
The above dynamic import cannot be analyzed by Vite.
See https://github.com/rollup/plugins/tree/master/packages/dynamic-import-vars#limitations for supported dynamic import formats. If this is intended to be left as-is, you can use the /* @vite-ignore */ comment inside the import() call to suppress this warning.

  Plugin: vite:import-analysis
  File: /workspaces/spark-template/src/lib/unified-storage.ts
7:54:23 PM [vite] (client)  new dependencies optimized: framer-motion
7:54:23 PM [vite] (client)  optimized dependencies changed. reloading
7:54:23 PM [vite] (client) warning:
/workspaces/spark-template/src/lib/component-registry.ts
21 |          for (const path of pathVariants){
22 |              try {
23 |                  const module = await import(path);
   |                                              ^^^^
24 |                  const exportName = componentConfig.export || 'default';
25 |                  if (module[exportName]) {
The above dynamic import cannot be analyzed by Vite.
See https://github.com/rollup/plugins/tree/master/packages/dynamic-import-vars#limitations for supported dynamic import formats. If this is intended to be left as-is, you can use the /* @vite-ignore */ comment inside the import() call to suppress this warning.

  Plugin: vite:import-analysis
  File: /workspaces/spark-template/src/lib/component-registry.ts
7:54:24 PM [vite] (client) warning:
/workspaces/spark-template/src/lib/unified-storage.ts
201 |          const moduleName = 'sql.js';
202 |          try {
203 |              return await import(moduleName);
    |                                  ^^^^^^^^^^
204 |          } catch  {
205 |              throw new Error(`${moduleName} not installed. Run: npm install ${moduleName}`);
The above dynamic import cannot be analyzed by Vite.
See https://github.com/rollup/plugins/tree/master/packages/dynamic-import-vars#limitations for supported dynamic import formats. If this is intended to be left as-is, you can use the /* @vite-ignore */ comment inside the import() call to suppress this warning.

  Plugin: vite:import-analysis
  File: /workspaces/spark-template/src/lib/unified-storage.ts
2026-01-17 20:01:58 +00:00
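Both warnings above point at the same pattern: an `import()` whose specifier is a plain runtime string, which `vite:import-analysis` cannot statically analyze. The warning's own suggested fix is the `/* @vite-ignore */` comment; a minimal sketch (the function name `loadModule` is hypothetical, not the project's code):

```typescript
// Suppress Vite's dynamic-import analysis for a fully dynamic specifier,
// as the warning text itself suggests. Note this also opts the module out
// of bundling, so the specifier must be resolvable at runtime.
async function loadModule(specifier: string): Promise<Record<string, unknown>> {
  return import(/* @vite-ignore */ specifier) as Promise<Record<string, unknown>>;
}
```

For the `component-registry.ts` case, the alternative is `import.meta.glob` over a known directory, which keeps the imports statically analyzable.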
4dfded3533 Generated by Spark: @johndoe6345789 ➜ /workspaces/low-code-react-app-b (main) $ npm run build
> spark-template@0.0.0 prebuild
> mkdir -p /tmp/dist || true

> spark-template@0.0.0 build
> tsc -b --noCheck && vite build

vite v7.3.1 building client environment for production...
<script src="/runtime-config.js"> in "/index.html" can't be bundled without type="module" attribute
✓ 37 modules transformed.
✗ Build failed in 1.07s
error during build:
[vite]: Rollup failed to resolve import "@github/spark/hooks" from "/workspaces/low-code-react-app-b/src/hooks/use-project-state.ts".
This is most likely unintended because it can break your application at runtime.
If you do want to externalize this module explicitly add it to
`build.rollupOptions.external`
    at viteLog (file:///workspaces/low-code-react-app-b/node_modules/vite/dist/node/chunks/config.js:33635:57)
    at file:///workspaces/low-code-react-app-b/node_modules/vite/dist/node/chunks/config.js:33669:73
    at onwarn (file:///workspaces/low-code-react-app-b/node_modules/@vitejs/plugin-react-swc/index.js:76:7)
    at file:///workspaces/low-code-react-app-b/node_modules/vite/dist/node/chunks/config.js:33669:28
    at onRollupLog (file:///workspaces/low-code-react-app-b/node_modules/vite/dist/node/chunks/config.js:33664:63)
    at onLog (file:///workspaces/low-code-react-app-b/node_modules/vite/dist/node/chunks/config.js:33467:4)
    at file:///workspaces/low-code-react-app-b/node_modules/rollup/dist/es/shared/node-entry.js:20961:32
    at Object.logger [as onLog] (file:///workspaces/low-code-react-app-b/node_modules/rollup/dist/es/shared/node-entry.js:22848:9)
    at ModuleLoader.handleInvalidResolvedId (file:///workspaces/low-code-react-app-b/node_modules/rollup/dist/es/shared/node-entry.js:21592:26)
    at file:///workspaces/low-code-react-app-b/node_modules/rollup/dist/es/shared/node-entry.js:21550:26
@johndoe6345789 ➜ /workspaces/low-code-react-app-b (main) $
2026-01-17 19:39:46 +00:00
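The build failure above is Rollup refusing to resolve `@github/spark/hooks` from `use-project-state.ts`. The direction of these commits is to remove the Spark dependency entirely, but if the import had to stay, the error message's own remedy is to externalize it — a config sketch, not the project's verified `vite.config.ts`:

```typescript
// vite.config.ts sketch: externalize the unresolvable package so Rollup
// leaves the import in place at build time. Only appropriate if something
// provides the module at runtime; otherwise, remove the import instead.
import { defineConfig } from "vite";

export default defineConfig({
  build: {
    rollupOptions: {
      external: ["@github/spark/hooks"],
    },
  },
});
```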
c06d538eea stuff 2026-01-17 19:35:08 +00:00
c008dd2fb7 Generated by Spark: Delete packages folder, do not work around this, just remove it. 2026-01-17 19:31:59 +00:00
50b6cbe5fe Generated by Spark: Remove packages folder and packages folder references. Use IndexedDB by default. Give user option to use Flask API, if Flask fails, switch back to IndexedDB. 2026-01-17 19:28:00 +00:00
44c5e848a2 Generated by Spark: Remove packages folder and packages folder references. Use IndexedDB by default. Give user option to use Flask API, if Flask fails, switch back to IndexedDB. 2026-01-17 19:24:24 +00:00
882f9b0d3b Generated by Spark: Remove packages folder and packages folder references. Use IndexedDB by default. Give user option to use Flask API, if Flask fails, switch back to IndexedDB. Actually delete the packages folder. 2026-01-17 19:20:00 +00:00
0d1a4c4c2b Generated by Spark: Remove packages folder and packages folder references. Use IndexedDB by default. Give user option to use Flask API, if Flask fails, switch back to IndexedDB. 2026-01-17 19:13:16 +00:00
3485cdd3fa Generated by Spark: Remove packages folder and packages folder references 2026-01-17 19:09:25 +00:00
21ed554d4f Generated by Spark: Run docker/build-push-action@v5
GitHub Actions runtime token ACs
Docker info
Proxy configuration
Buildx version
Builder info
/usr/bin/docker buildx build --cache-from type=gha --cache-to type=gha,mode=max --iidfile /home/runner/work/_temp/docker-actions-toolkit-StNnQE/build-iidfile-53e302b45a.txt --label org.opencontainers.image.created=2026-01-17T18:58:49.099Z --label org.opencontainers.image.description= --label org.opencontainers.image.licenses=MIT --label org.opencontainers.image.revision=595f1ae9c0eb07bd4018c5dd28e71a9474340f48 --label org.opencontainers.image.source=https://github.com/johndoe6345789/low-code-react-app-b --label org.opencontainers.image.title=low-code-react-app-b --label org.opencontainers.image.url=https://github.com/johndoe6345789/low-code-react-app-b --label org.opencontainers.image.version=main --platform linux/amd64,linux/arm64 --attest type=provenance,mode=max,builder-id=https://github.com/johndoe6345789/low-code-react-app-b/actions/runs/21099291076 --tag ghcr.io/johndoe6345789/low-code-react-app-b:main --tag ghcr.io/johndoe6345789/low-code-react-app-b:main-595f1ae --metadata-file /home/runner/work/_temp/docker-actions-toolkit-StNnQE/build-metadata-6b5d4d5c31.json --push .
#0 building with "builder-e8330c9c-acfd-4996-a552-945836e73373" instance using docker-container driver

#1 [internal] load build definition from Dockerfile
#1 transferring dockerfile: 310B done
#1 DONE 0.0s

#2 [auth] library/node:pull token for registry-1.docker.io
#2 DONE 0.0s

#3 [linux/amd64 internal] load metadata for docker.io/library/node:lts-alpine
#3 DONE 0.4s

#4 [linux/arm64 internal] load metadata for docker.io/library/node:lts-alpine
#4 DONE 0.4s

#5 [internal] load .dockerignore
#5 transferring context: 344B done
#5 DONE 0.0s

#6 importing cache manifest from gha:13603380523401637367
#6 DONE 0.1s

#7 [linux/amd64 1/8] FROM docker.io/library/node:lts-alpine@sha256:931d7d57f8c1fd0e2179dbff7cc7da4c9dd100998bc2b32afc85142d8efbc213
#7 resolve docker.io/library/node:lts-alpine@sha256:931d7d57f8c1fd0e2179dbff7cc7da4c9dd100998bc2b32afc85142d8efbc213 done
#7 sha256:185f31dc0b020b0ec03ea407f12404329981e6427d74bd739b2bd1b7dc6a37de 446B / 446B 0.1s done
#7 ...

#8 [internal] load build context
#8 transferring context: 10.67MB 0.2s done
#8 DONE 0.2s

#7 [linux/amd64 1/8] FROM docker.io/library/node:lts-alpine@sha256:931d7d57f8c1fd0e2179dbff7cc7da4c9dd100998bc2b32afc85142d8efbc213
#7 sha256:2376a6a774696529fd8c2bd55909fd081b5bec400b7f353ebf035c47fc32f195 1.26MB / 1.26MB 0.0s done
#7 sha256:1074353eec0db2c1d81d5af2671e56e00cf5738486f5762609ea33d606f88612 3.86MB / 3.86MB 0.1s done
#7 extracting sha256:1074353eec0db2c1d81d5af2671e56e00cf5738486f5762609ea33d606f88612
#7 sha256:37199d3b7bb1ac601fd713f67f7665417652fda0fd1bae677be5b6b8726489d5 19.23MB / 50.97MB 0.2s
#7 extracting sha256:1074353eec0db2c1d81d5af2671e56e00cf5738486f5762609ea33d606f88612 0.1s done
#7 sha256:37199d3b7bb1ac601fd713f67f7665417652fda0fd1bae677be5b6b8726489d5 50.97MB / 50.97MB 0.3s done
#7 DONE 0.5s

#9 [linux/arm64 1/8] FROM docker.io/library/node:lts-alpine@sha256:931d7d57f8c1fd0e2179dbff7cc7da4c9dd100998bc2b32afc85142d8efbc213
#9 resolve docker.io/library/node:lts-alpine@sha256:931d7d57f8c1fd0e2179dbff7cc7da4c9dd100998bc2b32afc85142d8efbc213 done
#9 sha256:920c32c30a576510d3c09d291ac6195b55bd619acd0c8f230c89e1c948d4a7e4 445B / 445B 0.0s done
#9 sha256:35ec70997fb95afb58d8feb4c2d37a7bde60008e3f77ebdc8513d6f9889a2110 1.26MB / 1.26MB 0.1s done
#9 sha256:f6b4fb9446345fcad2db26eac181fef6c0a919c8a4fcccd3bea5deb7f6dff67e 4.20MB / 4.20MB 0.1s done
#9 extracting sha256:f6b4fb9446345fcad2db26eac181fef6c0a919c8a4fcccd3bea5deb7f6dff67e 0.2s done
#9 sha256:f09cf2cc56809037a69d614c2d22d4355ba78c67cf62aac15a4bddc73245d9ff 51.86MB / 51.86MB 0.3s done
#9 extracting sha256:f09cf2cc56809037a69d614c2d22d4355ba78c67cf62aac15a4bddc73245d9ff
#9 extracting sha256:f09cf2cc56809037a69d614c2d22d4355ba78c67cf62aac15a4bddc73245d9ff 1.3s done
#9 DONE 1.7s

#7 [linux/amd64 1/8] FROM docker.io/library/node:lts-alpine@sha256:931d7d57f8c1fd0e2179dbff7cc7da4c9dd100998bc2b32afc85142d8efbc213
#7 extracting sha256:37199d3b7bb1ac601fd713f67f7665417652fda0fd1bae677be5b6b8726489d5
#7 extracting sha256:37199d3b7bb1ac601fd713f67f7665417652fda0fd1bae677be5b6b8726489d5 1.2s done
#7 extracting sha256:2376a6a774696529fd8c2bd55909fd081b5bec400b7f353ebf035c47fc32f195 0.0s done
#7 extracting sha256:185f31dc0b020b0ec03ea407f12404329981e6427d74bd739b2bd1b7dc6a37de 0.1s done
#7 DONE 1.8s

#9 [linux/arm64 1/8] FROM docker.io/library/node:lts-alpine@sha256:931d7d57f8c1fd0e2179dbff7cc7da4c9dd100998bc2b32afc85142d8efbc213
#9 extracting sha256:35ec70997fb95afb58d8feb4c2d37a7bde60008e3f77ebdc8513d6f9889a2110 0.1s done
#9 extracting sha256:920c32c30a576510d3c09d291ac6195b55bd619acd0c8f230c89e1c948d4a7e4 done
#9 DONE 1.7s

#10 [linux/arm64 2/8] WORKDIR /app
#10 DONE 0.6s

#11 [linux/amd64 2/8] WORKDIR /app
#11 DONE 0.6s

#12 [linux/amd64 3/8] COPY package*.json ./
#12 DONE 0.0s

#13 [linux/arm64 3/8] COPY package*.json ./
#13 DONE 0.0s

#14 [linux/amd64 4/8] COPY packages/spark-tools ./packages/spark-tools
#14 DONE 0.0s

#15 [linux/arm64 4/8] COPY packages/spark-tools ./packages/spark-tools
#15 DONE 0.0s

#16 [linux/amd64 5/8] COPY packages/spark ./packages/spark
#16 DONE 0.0s

#17 [linux/arm64 5/8] COPY packages/spark ./packages/spark
#17 DONE 0.0s

#18 [linux/arm64 6/8] RUN npm ci --include=optional --omit=dev
#18 ...

#19 [linux/amd64 6/8] RUN npm ci --include=optional --omit=dev
#19 17.25
#19 17.25 added 442 packages, and audited 445 packages in 17s
#19 17.25
#19 17.25 50 packages are looking for funding
#19 17.25   run `npm fund` for details
#19 17.25
#19 17.25 found 0 vulnerabilities
#19 17.26 npm notice
#19 17.26 npm notice New minor version of npm available! 11.6.2 -> 11.7.0
#19 17.26 npm notice Changelog: https://github.com/npm/cli/releases/tag/v11.7.0
#19 17.26 npm notice To update run: npm install -g npm@11.7.0
#19 17.26 npm notice
#19 DONE 18.8s

#18 [linux/arm64 6/8] RUN npm ci --include=optional --omit=dev
#18 ...

#20 [linux/amd64 7/8] COPY . .
#20 DONE 0.5s

#18 [linux/arm64 6/8] RUN npm ci --include=optional --omit=dev
#18 ...

#21 [linux/amd64 8/8] RUN npm run build
#21 0.231
#21 0.231 > spark-template@0.0.0 prebuild
#21 0.231 > mkdir -p /tmp/dist || true
#21 0.231
#21 0.240
#21 0.240 > spark-template@0.0.0 build
#21 0.240 > tsc -b --noCheck && vite build
#21 0.240
#21 0.242 sh: tsc: not found
#21 ERROR: process "/bin/sh -c npm run build" did not complete successfully: exit code: 127

#18 [linux/arm64 6/8] RUN npm ci --include=optional --omit=dev
#18 CANCELED
------
 > [linux/amd64 8/8] RUN npm run build:
0.231
0.231 > spark-template@0.0.0 prebuild
0.231 > mkdir -p /tmp/dist || true
0.231
0.240
0.240 > spark-template@0.0.0 build
0.240 > tsc -b --noCheck && vite build
0.240
0.242 sh: tsc: not found
------
Dockerfile:14
--------------------
  12 |     COPY . .
  13 |
  14 | >>> RUN npm run build
  15 |
  16 |     EXPOSE 80
--------------------
ERROR: failed to build: failed to solve: process "/bin/sh -c npm run build" did not complete successfully: exit code: 127
Error: buildx failed with: ERROR: failed to build: failed to solve: process "/bin/sh -c npm run build" did not complete successfully: exit code: 127
2026-01-17 19:03:10 +00:00
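The `sh: tsc: not found` / exit 127 above follows from step 6/8: `npm ci --include=optional --omit=dev` skips devDependencies, so `tsc` and `vite` are absent when `npm run build` runs. A common shape for the fix is a multi-stage build that installs dev dependencies for compilation and ships only the output — a sketch that omits this repo's `packages/spark-tools` and `packages/spark` COPY steps and assumes the build emits `dist/`:

```dockerfile
# Build stage: include devDependencies so tsc/vite are available.
FROM node:lts-alpine AS build
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build

# Runtime stage: serve the static output; no dev toolchain shipped.
FROM nginx:alpine
COPY --from=build /app/dist /usr/share/nginx/html
EXPOSE 80
```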
595f1ae9c0 Generated by Spark: A library in packages folder - Seems it's still trying to use fetch. If a fetch fails, switch back to IndexedDB. 2026-01-17 18:57:01 +00:00
cbb303fb8d Generated by Spark: Seems it's still trying to use fetch. If a fetch fails, switch back to IndexedDB. 2026-01-17 18:53:12 +00:00
7b91fe1975 Generated by Spark: I am having issues with this. Remember, we should be defaulting to IndexedDB, not the Flask API. Flask API is obtained by explicitly setting it in the UI or setting a env variable. export default function applyFetchPatch() {
  if (ssrSafeWindow) {
    ssrSafeWindow.fetch = async (input: string | URL | Request, init?: RequestInit) => {
      try {
        const response = await globalFetch(input, init)
        sendFetchStats({input, error: !response.ok, status: response.status})
        return response
      } catch (error) {
        sendFetchStats({input, error: true, status: 'unknown'})
        throw error
      }
    }
  }
}
2026-01-17 18:50:06 +00:00
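The fallback rule these commits ask for — try the remote fetch, and on any failure answer from the local IndexedDB store — is separable from the fetch-patching code quoted above. A sketch with hypothetical stand-in loaders:

```typescript
// Generic remote-with-local-fallback: any rejection from the remote
// loader (network down, Flask unreachable) is answered locally.
type Loader<T> = () => Promise<T>;

async function withLocalFallback<T>(remote: Loader<T>, local: Loader<T>): Promise<T> {
  try {
    return await remote();
  } catch {
    return await local();
  }
}
```

Wrapping each data access this way keeps "default to IndexedDB" true even when the Flask backend was explicitly selected but then fails mid-session.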
40b823dfe0 Generated by Spark: Should handle caprover / cloudflare cors - Check frontend and backend config. example setup: https://frontend.example.com https://backend.example.com 2026-01-17 18:44:54 +00:00
76e7716c10 Generated by Spark: I can auto-default to the Flask backend with a Docker environment variable. If it's not set, use IndexedDB. 2026-01-17 18:38:17 +00:00
dependabot[bot]
efa1e71877 Bump the pip group across 1 directory with 2 updates
Bumps the pip group with 2 updates in the /backend directory: [flask](https://github.com/pallets/flask) and [flask-cors](https://github.com/corydolphin/flask-cors).


Updates `flask` from 3.1.0 to 3.1.1
- [Release notes](https://github.com/pallets/flask/releases)
- [Changelog](https://github.com/pallets/flask/blob/main/CHANGES.rst)
- [Commits](https://github.com/pallets/flask/compare/3.1.0...3.1.1)

Updates `flask-cors` from 5.0.0 to 6.0.0
- [Release notes](https://github.com/corydolphin/flask-cors/releases)
- [Changelog](https://github.com/corydolphin/flask-cors/blob/main/CHANGELOG.md)
- [Commits](https://github.com/corydolphin/flask-cors/compare/5.0.0...6.0.0)

---
updated-dependencies:
- dependency-name: flask
  dependency-version: 3.1.1
  dependency-type: direct:production
  dependency-group: pip
- dependency-name: flask-cors
  dependency-version: 6.0.0
  dependency-type: direct:production
  dependency-group: pip
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-17 18:32:41 +00:00
519ad0016d Generated by Spark: Ok I figured it out. Make a backend folder, put a flask backend in it with a Dockerfile. Main UI uses IndexedDB then with a UI setting, it can be moved to the flask backend. 2026-01-17 18:31:43 +00:00
02eb47e83f Generated by Spark: Perhaps it could use sqlite on disk if possible, else use indexeddb 2026-01-17 18:19:45 +00:00
270d0be790 Generated by Spark: Reduce reliance on spark database. Just use sqlite. 2026-01-17 18:14:23 +00:00
0f01311120 Generated by Spark: The sidebar change has potential but the styling is messed up. Maybe it should load the styling from a theme json 2026-01-17 18:01:25 +00:00
e1a51ebc9a Generated by Spark: Fix all reported errors. 2026-01-17 17:55:26 +00:00
6c05d8daab Generated by Spark: Would be nice if sidebar pushed content to right 2026-01-17 17:51:12 +00:00
feaa7d7a9e Generated by Spark: Make this function smarter, wide variety of search paths and other permissive logic - function createLazyComponent(componentConfig: ComponentConfig) {
  const loader = () => {
    if (componentConfig.preloadDependencies) {
      componentConfig.preloadDependencies.forEach(depName => {
        const preloader = dependencyPreloaders[depName]
        if (preloader) {
          preloader()
        }
      })
    }

    return import(componentConfig.path).then(m => ({ default: m[componentConfig.export] }))
  }
2026-01-17 17:46:36 +00:00
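The "wide variety of search paths" this commit asks for usually means expanding one configured specifier into an ordered candidate list before attempting the dynamic import. A hypothetical helper, not the project's actual implementation:

```typescript
// Expand a configured component path into the candidates a permissive
// loader could try in order: bare specifier, explicit extensions, then
// index-file variants.
function pathVariants(base: string): string[] {
  const stem = base.replace(/\.(tsx|ts|jsx|js)$/, "");
  return [stem, `${stem}.tsx`, `${stem}.ts`, `${stem}/index.tsx`, `${stem}/index.ts`];
}
```

A loader would iterate this list and return the first specifier whose import resolves and exposes the expected export.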
8e1e35196a Generated by Spark: Fix all reported errors. Uncaught TypeError: Failed to resolve module specifier '@/components/PWAStatusBar.tsx' Your usual fix is a red herring. 2026-01-17 17:40:04 +00:00
bd2443c7ec Generated by Spark: code is flip-flopping between this change - it's something else. red herring. "path": "@/components/ProjectDashboard", 2026-01-17 17:33:47 +00:00
b3f39cc2e0 Generated by Spark: Fix all reported errors. 2026-01-17 17:25:12 +00:00
4c062d5d25 Generated by Spark: Fix all reported errors. 2026-01-17 17:22:22 +00:00
a92f0b9aba Generated by Spark: Fix all reported errors. 2026-01-17 17:20:09 +00:00
bad1e7b0f7 Generated by Spark: View ROADMAP.md - work on Near-Term (Next 2-3 Iterations) 2026-01-17 17:18:02 +00:00
cdbc7d98a3 Generated by Spark: Read all summary documents and write a ROADMAP.md 2026-01-17 17:11:39 +00:00
ed7e2f6626 Generated by Spark: Identify hard coded stuff that can be moved to json - worth reading project incl README.md first. 2026-01-17 17:08:36 +00:00