mirror of
https://github.com/johndoe6345789/metabuilder.git
synced 2026-04-29 08:14:57 +00:00
various changes
This commit is contained in:
298
frontends/codegen/scripts/QEMU_SCRIPTS_README.md
Normal file
@@ -0,0 +1,298 @@
# Multi-Architecture Build Scripts

This directory contains helper scripts for QEMU multi-architecture Docker builds.

## Scripts

### 🚀 build-multiarch.sh

Builds multi-architecture Docker images locally using QEMU and Docker Buildx.

**Usage:**
```bash
# Make executable
chmod +x scripts/build-multiarch.sh

# Build for local testing (loads into Docker)
./scripts/build-multiarch.sh myapp latest

# Build and push to registry
./scripts/build-multiarch.sh myapp latest "linux/amd64,linux/arm64" ghcr.io --push

# Custom platforms
./scripts/build-multiarch.sh myapp v1.0 "linux/amd64,linux/arm64,linux/arm/v7" ghcr.io --push
```

**Parameters:**
1. `IMAGE_NAME` - Docker image name (default: `myapp`)
2. `IMAGE_TAG` - Image tag (default: `latest`)
3. `PLATFORMS` - Comma-separated platforms (default: `linux/amd64,linux/arm64`)
4. `REGISTRY` - Container registry (default: `ghcr.io`)
5. `--push` - Push to registry (omit to load locally)

**Features:**
- ✅ Automatic QEMU setup
- ✅ Buildx builder configuration
- ✅ Color-coded output
- ✅ Error handling
- ✅ Progress indicators

### 🔍 validate-qemu.sh

Validates QEMU installation and multi-architecture build capabilities.

**Usage:**
```bash
# Make executable
chmod +x scripts/validate-qemu.sh

# Run validation
./scripts/validate-qemu.sh
```

**Checks:**
- ✅ Docker installation
- ✅ Docker Buildx availability
- ✅ QEMU installation
- ✅ QEMU binaries functionality
- ✅ Buildx builder setup
- ✅ Platform support (AMD64, ARM64)
- ✅ Test builds for each platform
- ✅ CI/CD configuration validation

**Exit Codes:**
- `0` - All validations passed
- `1` - One or more validations failed

## Quick Start

### First Time Setup

```bash
# 1. Make scripts executable
chmod +x scripts/*.sh

# 2. Validate your environment
./scripts/validate-qemu.sh

# 3. Build your first multi-arch image
./scripts/build-multiarch.sh codeforge latest
```

### Local Development

```bash
# Build and load into local Docker (AMD64 only for speed)
./scripts/build-multiarch.sh myapp dev

# Run the image
docker run -p 80:80 ghcr.io/myapp:dev
```

### Production Release

```bash
# Build multi-arch and push to registry
./scripts/build-multiarch.sh codeforge v1.2.3 "linux/amd64,linux/arm64" ghcr.io --push

# Verify the manifest
docker manifest inspect ghcr.io/codeforge:v1.2.3
```

## Supported Platforms

### Default Platforms
- `linux/amd64` - Intel/AMD 64-bit (x86_64)
- `linux/arm64` - ARM 64-bit (aarch64)

### Additional Platforms (Optional)
- `linux/arm/v7` - ARM 32-bit (armv7l) - Raspberry Pi 3 and older
- `linux/arm/v6` - ARM 32-bit (armv6l) - Raspberry Pi Zero
- `linux/ppc64le` - IBM POWER (Little Endian)
- `linux/s390x` - IBM Z mainframe
- `linux/386` - Intel/AMD 32-bit (i386)
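When scripting around these platform strings, it can help to map the host's `uname -m` output to the Docker platform names above. A minimal sketch (the `host_platform` helper is hypothetical, not part of these scripts):

```shell
#!/bin/sh
# Hypothetical helper: map a `uname -m` machine string to the
# corresponding Docker platform string from the tables above.
host_platform() {
  case "${1:-$(uname -m)}" in
    x86_64)        echo "linux/amd64" ;;
    aarch64|arm64) echo "linux/arm64" ;;
    armv7l)        echo "linux/arm/v7" ;;
    armv6l)        echo "linux/arm/v6" ;;
    ppc64le)       echo "linux/ppc64le" ;;
    s390x)         echo "linux/s390x" ;;
    i386|i686)     echo "linux/386" ;;
    *)             echo "unknown" ;;
  esac
}

host_platform x86_64   # → linux/amd64
```

Called with no argument, it classifies the current machine, which is handy for choosing a fast native platform during development.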
## CI/CD Integration

These scripts are reference implementations. The actual CI/CD pipelines use:

### GitHub Actions
```yaml
- name: Set up QEMU
  uses: docker/setup-qemu-action@v3
  with:
    platforms: linux/amd64,linux/arm64
```

### CircleCI
```yaml
- run:
    name: Install QEMU
    command: docker run --rm --privileged multiarch/qemu-user-static --reset -p yes
```

### GitLab CI
```yaml
before_script:
  - docker run --rm --privileged multiarch/qemu-user-static --reset -p yes
  - docker buildx create --name multiarch --driver docker-container --use
```

### Jenkins
```groovy
sh 'docker run --rm --privileged multiarch/qemu-user-static --reset -p yes'
sh 'docker buildx create --name multiarch --driver docker-container --use'
```

## Troubleshooting

### Permission Denied

```bash
# Run with sudo
sudo ./scripts/build-multiarch.sh myapp latest

# Or add your user to the docker group
sudo usermod -aG docker $USER
newgrp docker
```

### QEMU Not Found

```bash
# Manually install QEMU
docker run --rm --privileged multiarch/qemu-user-static --reset -p yes
```

### Buildx Not Available

```bash
# Install Docker Buildx
docker buildx install

# Verify installation
docker buildx version
```

### Slow Builds

Cross-compilation (especially AMD64 → ARM64) is slower than native builds. This is normal.

**Optimization tips:**
- Use build cache: `--cache-from type=gha --cache-to type=gha,mode=max`
- Build a single platform for development: remove `--platform` or specify one arch
- Use native runners: GitHub Actions has ARM64 runners available

### Platform Not Supported

Some base images don't support all platforms. Check the base image documentation.

```bash
# Check available platforms for an image
docker manifest inspect alpine:latest
```
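The manifest JSON lists one entry per platform; the architectures can be pulled out with standard text tools. A minimal sketch over a hypothetical, abbreviated manifest captured as a string (so it runs without Docker or jq):

```shell
#!/bin/sh
# Hypothetical, abbreviated output of `docker manifest inspect`,
# captured as a string for illustration.
manifest='{"manifests":[{"platform":{"os":"linux","architecture":"amd64"}},{"platform":{"os":"linux","architecture":"arm64"}}]}'

# Split on commas so each "architecture" key lands on its own line,
# then extract the value. Prints one architecture per line.
echo "$manifest" | tr ',' '\n' | sed -n 's/.*"architecture":"\([^"]*\)".*/\1/p'
```

In practice you would pipe `docker manifest inspect IMAGE` straight into the same `tr | sed` pipeline (or use `jq` if it is available).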
## Performance Benchmarks

Approximate build times for a typical web application:

| Configuration | Time | Notes |
|--------------|------|-------|
| AMD64 only (native) | 5-8 min | Fastest |
| ARM64 only (emulated) | 10-15 min | Cross-compiled on AMD64 |
| AMD64 + ARM64 | 15-20 min | Both platforms |
| AMD64 + ARM64 + ARMv7 | 20-30 min | Three platforms |

## Environment Variables

The scripts support these environment variables:

```bash
# Docker registry credentials
export DOCKER_USERNAME="your-username"
export DOCKER_PASSWORD="your-token"

# Custom registry
export REGISTRY="ghcr.io"

# Build options
export DOCKER_BUILDKIT=1
export BUILDKIT_PROGRESS=plain
```
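These variables are optional. A typical consumption pattern (assumed here, mirroring the positional-parameter defaults in build-multiarch.sh) is to fall back to a default when the variable is unset:

```shell
#!/bin/sh
# Hypothetical illustration of ${VAR:-default}: an exported value
# wins, otherwise the built-in default is substituted.
unset REGISTRY
default_registry="${REGISTRY:-ghcr.io}"

REGISTRY="registry.example.com"
custom_registry="${REGISTRY:-ghcr.io}"

echo "$default_registry $custom_registry"
```

This keeps the scripts usable with zero configuration while still letting CI override the registry per environment.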
## Examples

### Example 1: Development Build

```bash
# Quick local build for testing
./scripts/build-multiarch.sh myapp dev

# Test the image
docker run -p 3000:80 ghcr.io/myapp:dev
curl http://localhost:3000
```

### Example 2: Staging Release

```bash
# Build and push to staging
./scripts/build-multiarch.sh myapp staging "linux/amd64,linux/arm64" ghcr.io --push

# Deploy on the staging server
docker pull ghcr.io/myapp:staging
docker run -d -p 80:80 ghcr.io/myapp:staging
```

### Example 3: Production Release

```bash
# Build with a version tag
./scripts/build-multiarch.sh myapp v2.1.0 "linux/amd64,linux/arm64" ghcr.io --push

# Also tag as latest
docker tag ghcr.io/myapp:v2.1.0 ghcr.io/myapp:latest
docker push ghcr.io/myapp:latest
```

### Example 4: IoT/Edge Devices

```bash
# Build for Raspberry Pi (ARMv7 + ARM64)
./scripts/build-multiarch.sh iot-app v1.0 "linux/arm64,linux/arm/v7" ghcr.io --push

# Pull on the Raspberry Pi
docker pull ghcr.io/iot-app:v1.0
docker run ghcr.io/iot-app:v1.0
```

## Additional Resources

- [QEMU Integration Guide](../QEMU_INTEGRATION.md) - Full documentation
- [CI/CD Summary](../QEMU_CI_CD_SUMMARY.md) - Implementation details
- [Docker Buildx](https://docs.docker.com/buildx/) - Official documentation
- [QEMU User Static](https://github.com/multiarch/qemu-user-static) - QEMU binaries

## Contributing

When adding new scripts:

1. Follow the existing script structure
2. Add color-coded output for better UX
3. Include error handling with meaningful messages
4. Document usage and parameters
5. Update this README

## Support

For issues with multi-architecture builds:

1. Run validation: `./scripts/validate-qemu.sh`
2. Check Docker version: `docker --version` (v20.10+ recommended)
3. Verify QEMU: `docker run --rm multiarch/qemu-user-static --version`
4. Review logs for specific error messages

---

*Last Updated: 2024*
*Maintained by: Development Team*
190
frontends/codegen/scripts/analyze-duplicates.ts
Normal file
@@ -0,0 +1,190 @@
#!/usr/bin/env tsx
/**
 * Analyze duplicate TSX files before deletion
 * Check JSON contents to ensure they're complete
 */

import fs from 'fs'
import path from 'path'
import { globSync } from 'fs' // Node 22+ (fs.globSync); use the 'glob' package on older runtimes

const ROOT_DIR = path.resolve(process.cwd())
const CONFIG_PAGES_DIR = path.join(ROOT_DIR, 'src/config/pages')
const COMPONENTS_DIR = path.join(ROOT_DIR, 'src/components')
const JSON_DEFS_DIR = path.join(ROOT_DIR, 'src/components/json-definitions')

function toKebabCase(str: string): string {
  return str.replace(/([A-Z])/g, '-$1').toLowerCase().replace(/^-/, '')
}

interface AnalysisResult {
  tsx: string
  json: string
  tsxSize: number
  jsonSize: number
  tsxHasHooks: boolean
  tsxHasState: boolean
  tsxHasEffects: boolean
  jsonHasBindings: boolean
  jsonHasChildren: boolean
  recommendation: 'safe-to-delete' | 'needs-review' | 'keep-tsx'
  reason: string
}

async function analyzeTsxFile(filePath: string): Promise<{
  hasHooks: boolean
  hasState: boolean
  hasEffects: boolean
}> {
  const content = fs.readFileSync(filePath, 'utf-8')

  return {
    hasHooks: /use[A-Z]/.test(content),
    hasState: /useState|useReducer/.test(content),
    hasEffects: /useEffect|useLayoutEffect/.test(content)
  }
}

async function analyzeJsonFile(filePath: string): Promise<{
  hasBindings: boolean
  hasChildren: boolean
  size: number
}> {
  const content = fs.readFileSync(filePath, 'utf-8')
  const json = JSON.parse(content)

  return {
    hasBindings: !!json.bindings || hasNestedBindings(json),
    hasChildren: !!json.children,
    size: content.length
  }
}

function hasNestedBindings(obj: any): boolean {
  if (!obj || typeof obj !== 'object') return false
  if (obj.bindings) return true

  for (const key in obj) {
    if (hasNestedBindings(obj[key])) return true
  }
  return false
}

async function analyzeDuplicates() {
  console.log('🔍 Analyzing duplicate TSX files...\n')

  const results: AnalysisResult[] = []

  // Find all TSX files in atoms, molecules, organisms
  const categories = ['atoms', 'molecules', 'organisms']

  for (const category of categories) {
    const tsxFiles = globSync(path.join(COMPONENTS_DIR, category, '*.tsx'))

    for (const tsxFile of tsxFiles) {
      const basename = path.basename(tsxFile, '.tsx')
      const kebab = toKebabCase(basename)

      // Check for a JSON equivalent in config/pages
      const jsonPath = path.join(CONFIG_PAGES_DIR, category, `${kebab}.json`)

      if (!fs.existsSync(jsonPath)) continue

      // Check for a JSON definition
      const jsonDefPath = path.join(JSON_DEFS_DIR, `${kebab}.json`)

      // Analyze both files
      const tsxAnalysis = await analyzeTsxFile(tsxFile)
      const tsxSize = fs.statSync(tsxFile).size

      let jsonAnalysis = { hasBindings: false, hasChildren: false, size: 0 }
      let actualJsonPath = jsonPath

      if (fs.existsSync(jsonDefPath)) {
        jsonAnalysis = await analyzeJsonFile(jsonDefPath)
        actualJsonPath = jsonDefPath
      } else if (fs.existsSync(jsonPath)) {
        jsonAnalysis = await analyzeJsonFile(jsonPath)
      }

      // Determine recommendation
      let recommendation: AnalysisResult['recommendation'] = 'safe-to-delete'
      let reason = 'JSON definition exists'

      if (tsxAnalysis.hasState || tsxAnalysis.hasEffects) {
        if (!jsonAnalysis.hasBindings && jsonAnalysis.size < 500) {
          recommendation = 'needs-review'
          reason = 'TSX has state/effects but JSON seems incomplete'
        } else {
          recommendation = 'safe-to-delete'
          reason = 'TSX has hooks but JSON should handle via createJsonComponentWithHooks'
        }
      }

      if (tsxSize > 5000 && jsonAnalysis.size < 1000) {
        recommendation = 'needs-review'
        reason = 'TSX is large but JSON is small - might be missing content'
      }

      results.push({
        tsx: path.relative(ROOT_DIR, tsxFile),
        json: path.relative(ROOT_DIR, actualJsonPath),
        tsxSize,
        jsonSize: jsonAnalysis.size,
        tsxHasHooks: tsxAnalysis.hasHooks,
        tsxHasState: tsxAnalysis.hasState,
        tsxHasEffects: tsxAnalysis.hasEffects,
        jsonHasBindings: jsonAnalysis.hasBindings,
        jsonHasChildren: jsonAnalysis.hasChildren,
        recommendation,
        reason
      })
    }
  }

  // Print results
  console.log(`📊 Found ${results.length} duplicate components\n`)

  const safeToDelete = results.filter(r => r.recommendation === 'safe-to-delete')
  const needsReview = results.filter(r => r.recommendation === 'needs-review')
  const keepTsx = results.filter(r => r.recommendation === 'keep-tsx')

  console.log(`✅ Safe to delete: ${safeToDelete.length}`)
  console.log(`⚠️ Needs review: ${needsReview.length}`)
  console.log(`🔴 Keep TSX: ${keepTsx.length}\n`)

  if (needsReview.length > 0) {
    console.log('⚠️ NEEDS REVIEW:')
    console.log('='.repeat(80))
    for (const result of needsReview.slice(0, 10)) {
      console.log(`\n${result.tsx}`)
      console.log(`  → ${result.json}`)
      console.log(`  TSX: ${result.tsxSize} bytes | JSON: ${result.jsonSize} bytes`)
      console.log(`  TSX hooks: ${result.tsxHasHooks} | state: ${result.tsxHasState} | effects: ${result.tsxHasEffects}`)
      console.log(`  JSON bindings: ${result.jsonHasBindings} | children: ${result.jsonHasChildren}`)
      console.log(`  Reason: ${result.reason}`)
    }
    if (needsReview.length > 10) {
      console.log(`\n... and ${needsReview.length - 10} more`)
    }
  }

  // Write full report
  const reportPath = path.join(ROOT_DIR, 'duplicate-analysis.json')
  fs.writeFileSync(reportPath, JSON.stringify(results, null, 2))
  console.log(`\n📄 Full report written to: ${reportPath}`)

  // Generate deletion script for safe components
  if (safeToDelete.length > 0) {
    const deletionScript = safeToDelete.map(r => `rm "${r.tsx}"`).join('\n')
    const scriptPath = path.join(ROOT_DIR, 'delete-duplicates.sh')
    fs.writeFileSync(scriptPath, deletionScript)
    console.log(`📝 Deletion script written to: ${scriptPath}`)
    console.log(`   Run: bash delete-duplicates.sh`)
  }
}

analyzeDuplicates().catch(error => {
  console.error('❌ Analysis failed:', error)
  process.exit(1)
})
75
frontends/codegen/scripts/analyze-pure-json-candidates.ts
Normal file
@@ -0,0 +1,75 @@
import fs from 'node:fs/promises'
import path from 'node:path'
import { fileURLToPath } from 'node:url'

const __dirname = path.dirname(fileURLToPath(import.meta.url))
const rootDir = path.resolve(__dirname, '..')

const componentsToAnalyze = {
  molecules: ['DataSourceCard', 'EditorToolbar', 'EmptyEditorState', 'MonacoEditorPanel', 'SearchBar'],
  organisms: ['EmptyCanvasState', 'PageHeader', 'SchemaEditorCanvas', 'SchemaEditorPropertiesPanel',
    'SchemaEditorSidebar', 'SchemaEditorStatusBar', 'SchemaEditorToolbar', 'ToolbarActions'],
}

async function analyzeComponent(category: string, component: string): Promise<void> {
  const tsFile = path.join(rootDir, `src/components/${category}/${component}.tsx`)
  const content = await fs.readFile(tsFile, 'utf-8')

  // Check if it's pure composition (only uses UI primitives)
  const hasBusinessLogic = /useState|useEffect|useCallback|useMemo|useReducer|useRef/.test(content)
  const hasComplexLogic = /if\s*\(.*\{|switch\s*\(|for\s*\(|while\s*\(/.test(content)

  // Extract what it imports
  const imports = content.match(/import\s+\{[^}]+\}\s+from\s+['"][^'"]+['"]/g) || []
  const importedComponents = imports.flatMap(imp => {
    const match = imp.match(/\{([^}]+)\}/)
    return match ? match[1].split(',').map(s => s.trim()) : []
  })

  // Check if it only imports from ui/atoms (pure composition)
  const onlyUIPrimitives = imports.every(imp =>
    imp.includes('@/components/ui/') ||
    imp.includes('@/components/atoms/') ||
    imp.includes('@/lib/utils') ||
    imp.includes('lucide-react') ||
    imp.includes('@phosphor-icons')
  )

  const lineCount = content.split('\n').length

  console.log(`\n📄 ${component}`)
  console.log(`   Lines: ${lineCount}`)
  console.log(`   Has hooks: ${hasBusinessLogic ? '❌' : '✅'}`)
  console.log(`   Has complex logic: ${hasComplexLogic ? '❌' : '✅'}`)
  console.log(`   Only UI primitives: ${onlyUIPrimitives ? '✅' : '❌'}`)
  console.log(`   Imports: ${importedComponents.slice(0, 5).join(', ')}${importedComponents.length > 5 ? '...' : ''}`)

  if (!hasBusinessLogic && onlyUIPrimitives && lineCount < 100) {
    console.log(`   🎯 CANDIDATE FOR PURE JSON`)
  }
}

async function main() {
  console.log('🔍 Analyzing components for pure JSON conversion...\n')
  console.log('Looking for components that:')
  console.log('  - Have no hooks (useState, useEffect, etc.)')
  console.log('  - Have no complex logic')
  console.log('  - Import only UI primitives')
  console.log('  - Are simple compositions\n')

  for (const [category, components] of Object.entries(componentsToAnalyze)) {
    console.log(`\n═══ ${category.toUpperCase()} ═══`)
    for (const component of components) {
      try {
        await analyzeComponent(category, component)
      } catch (e) {
        console.log(`\n📄 ${component}`)
        console.log(`   ⚠️ Could not analyze: ${e}`)
      }
    }
  }

  console.log('\n\n✨ Analysis complete!')
}

main().catch(console.error)
302
frontends/codegen/scripts/audit-json-components.ts
Normal file
@@ -0,0 +1,302 @@
#!/usr/bin/env tsx
/**
 * Audit script for JSON component definitions
 *
 * Goals:
 * 1. Phase out src/components TSX files
 * 2. Audit existing JSON definitions for completeness and correctness
 */

import fs from 'fs'
import path from 'path'
import { globSync } from 'fs' // Node 22+ (fs.globSync); use the 'glob' package on older runtimes

interface AuditIssue {
  severity: 'error' | 'warning' | 'info'
  category: string
  file?: string
  message: string
  suggestion?: string
}

interface AuditReport {
  timestamp: string
  issues: AuditIssue[]
  stats: {
    totalJsonFiles: number
    totalTsxFiles: number
    registryEntries: number
    orphanedJson: number
    duplicates: number
    obsoleteWrapperRefs: number
  }
}

const ROOT_DIR = path.resolve(process.cwd())
const CONFIG_PAGES_DIR = path.join(ROOT_DIR, 'src/config/pages')
const COMPONENTS_DIR = path.join(ROOT_DIR, 'src/components')
const JSON_DEFS_DIR = path.join(ROOT_DIR, 'src/components/json-definitions')
const REGISTRY_FILE = path.join(ROOT_DIR, 'json-components-registry.json')

async function loadRegistry(): Promise<any> {
  const content = fs.readFileSync(REGISTRY_FILE, 'utf-8')
  return JSON.parse(content)
}

function findAllFiles(pattern: string, cwd: string = ROOT_DIR): string[] {
  const fullPattern = path.join(cwd, pattern)
  return globSync(fullPattern, { ignore: '**/node_modules/**' })
}

function toKebabCase(str: string): string {
  return str.replace(/([A-Z])/g, '-$1').toLowerCase().replace(/^-/, '')
}

function toPascalCase(str: string): string {
  return str
    .split('-')
    .map(word => word.charAt(0).toUpperCase() + word.slice(1))
    .join('')
}

async function auditJsonComponents(): Promise<AuditReport> {
  const issues: AuditIssue[] = []
  const registry = await loadRegistry()

  // Find all files
  const jsonFiles = findAllFiles('src/config/pages/**/*.json')
  const tsxFiles = findAllFiles('src/components/**/*.tsx')
  const jsonDefFiles = findAllFiles('src/components/json-definitions/*.json')

  console.log(`📊 Found ${jsonFiles.length} JSON files in config/pages`)
  console.log(`📊 Found ${tsxFiles.length} TSX files in src/components`)
  console.log(`📊 Found ${jsonDefFiles.length} JSON definitions`)
  console.log(`📊 Found ${registry.components?.length || 0} registry entries\n`)

  // Build registry lookup maps
  const registryByType = new Map<string, any>()
  const registryByName = new Map<string, any>()

  if (registry.components) {
    for (const component of registry.components) {
      if (component.type) registryByType.set(component.type, component)
      if (component.name) registryByName.set(component.name, component)
    }
  }

  // Check 1: Find TSX files that have JSON equivalents in config/pages
  console.log('🔍 Checking for TSX files that could be replaced with JSON...')
  const tsxBasenames = new Set<string>()
  for (const tsxFile of tsxFiles) {
    const basename = path.basename(tsxFile, '.tsx')
    const dir = path.dirname(tsxFile)
    const category = path.basename(dir) // atoms, molecules, organisms

    if (!['atoms', 'molecules', 'organisms'].includes(category)) continue

    tsxBasenames.add(basename)
    const kebab = toKebabCase(basename)

    // Check if there's a corresponding JSON file in config/pages
    const possibleJsonPath = path.join(CONFIG_PAGES_DIR, category, `${kebab}.json`)

    if (fs.existsSync(possibleJsonPath)) {
      issues.push({
        severity: 'warning',
        category: 'duplicate-implementation',
        file: tsxFile,
        message: `TSX file has JSON equivalent at ${path.relative(ROOT_DIR, possibleJsonPath)}`,
        suggestion: `Consider removing TSX and routing through JSON renderer`
      })
    }
  }

  // Check 2: Find JSON files without registry entries
  console.log('🔍 Checking for orphaned JSON files...')
  for (const jsonFile of jsonFiles) {
    const content = JSON.parse(fs.readFileSync(jsonFile, 'utf-8'))
    const componentType = content.type

    if (componentType && !registryByType.has(componentType)) {
      issues.push({
        severity: 'error',
        category: 'orphaned-json',
        file: jsonFile,
        message: `JSON file references type "${componentType}" which is not in registry`,
        suggestion: `Add registry entry for ${componentType} in json-components-registry.json`
      })
    }
  }

  // Check 3: Find components with obsolete wrapper references
  console.log('🔍 Checking for obsolete wrapper references...')
  for (const component of registry.components || []) {
    if (component.wrapperRequired || component.wrapperComponent) {
      issues.push({
        severity: 'warning',
        category: 'obsolete-wrapper-ref',
        file: `registry: ${component.type}`,
        message: `Component "${component.type}" has obsolete wrapperRequired/wrapperComponent fields`,
        suggestion: `Remove wrapperRequired and wrapperComponent fields - use createJsonComponentWithHooks instead`
      })
    }
  }

  // Check 4: Find components with load.path that don't exist
  console.log('🔍 Checking for broken load paths...')
  for (const component of registry.components || []) {
    if (component.load?.path) {
      const loadPath = component.load.path.replace('@/', 'src/')
      const possibleExtensions = ['.tsx', '.ts', '.jsx', '.js']
      let found = false

      for (const ext of possibleExtensions) {
        if (fs.existsSync(path.join(ROOT_DIR, loadPath + ext))) {
          found = true
          break
        }
      }

      if (!found) {
        issues.push({
          severity: 'error',
          category: 'broken-load-path',
          file: `registry: ${component.type}`,
          message: `Component "${component.type}" has load.path "${component.load.path}" but file not found`,
          suggestion: `Fix or remove load.path in registry`
        })
      }
    }
  }

  // Check 5: Components in src/components/molecules without JSON definitions
  console.log('🔍 Checking molecules without JSON definitions...')
  const moleculeTsxFiles = tsxFiles.filter(f => f.includes('/molecules/'))
  const jsonDefBasenames = new Set(
    jsonDefFiles.map(f => path.basename(f, '.json'))
  )

  for (const tsxFile of moleculeTsxFiles) {
    const basename = path.basename(tsxFile, '.tsx')
    const kebab = toKebabCase(basename)

    if (!jsonDefBasenames.has(kebab) && registryByType.has(basename)) {
      const entry = registryByType.get(basename)
      if (entry.source === 'molecules' && !entry.load?.path) {
        issues.push({
          severity: 'info',
          category: 'potential-conversion',
          file: tsxFile,
          message: `Molecule "${basename}" could potentially be converted to JSON`,
          suggestion: `Evaluate if ${basename} can be expressed as pure JSON`
        })
      }
    }
  }

  const stats = {
    totalJsonFiles: jsonFiles.length,
    totalTsxFiles: tsxFiles.length,
    registryEntries: registry.components?.length || 0,
    orphanedJson: issues.filter(i => i.category === 'orphaned-json').length,
    duplicates: issues.filter(i => i.category === 'duplicate-implementation').length,
    obsoleteWrapperRefs: issues.filter(i => i.category === 'obsolete-wrapper-ref').length
  }

  return {
    timestamp: new Date().toISOString(),
    issues,
    stats
  }
}

function printReport(report: AuditReport) {
  console.log('\n' + '='.repeat(80))
  console.log('📋 AUDIT REPORT')
  console.log('='.repeat(80))
  console.log(`\n📅 Generated: ${report.timestamp}\n`)

  console.log('📈 Statistics:')
  console.log(`  • Total JSON files: ${report.stats.totalJsonFiles}`)
  console.log(`  • Total TSX files: ${report.stats.totalTsxFiles}`)
  console.log(`  • Registry entries: ${report.stats.registryEntries}`)
  console.log(`  • Orphaned JSON: ${report.stats.orphanedJson}`)
  console.log(`  • Obsolete wrapper refs: ${report.stats.obsoleteWrapperRefs}`)
  console.log(`  • Duplicate implementations: ${report.stats.duplicates}\n`)

  // Group issues by category
  const byCategory = new Map<string, AuditIssue[]>()
  for (const issue of report.issues) {
    if (!byCategory.has(issue.category)) {
      byCategory.set(issue.category, [])
    }
    byCategory.get(issue.category)!.push(issue)
  }

  // Print issues by severity
  const severityOrder = ['error', 'warning', 'info'] as const
  const severityIcons = { error: '❌', warning: '⚠️', info: 'ℹ️' }

  for (const severity of severityOrder) {
    const issuesOfSeverity = report.issues.filter(i => i.severity === severity)
    if (issuesOfSeverity.length === 0) continue

    console.log(`\n${severityIcons[severity]} ${severity.toUpperCase()} (${issuesOfSeverity.length})`)
    console.log('-'.repeat(80))

    const categories = new Map<string, AuditIssue[]>()
    for (const issue of issuesOfSeverity) {
      if (!categories.has(issue.category)) {
        categories.set(issue.category, [])
      }
      categories.get(issue.category)!.push(issue)
    }

    for (const [category, issues] of categories) {
      console.log(`\n  ${category.replace(/-/g, ' ').toUpperCase()} (${issues.length}):`)

      for (const issue of issues.slice(0, 5)) { // Show first 5 of each category
        console.log(`  • ${issue.file || 'N/A'}`)
        console.log(`    ${issue.message}`)
        if (issue.suggestion) {
          console.log(`    💡 ${issue.suggestion}`)
        }
      }

      if (issues.length > 5) {
        console.log(`  ... and ${issues.length - 5} more`)
      }
    }
  }

  console.log('\n' + '='.repeat(80))
  console.log(`Total issues found: ${report.issues.length}`)
  console.log('='.repeat(80) + '\n')
}

async function main() {
  console.log('🔍 Starting JSON component audit...\n')

  const report = await auditJsonComponents()

  printReport(report)

  // Write report to file
  const reportPath = path.join(ROOT_DIR, 'audit-report.json')
  fs.writeFileSync(reportPath, JSON.stringify(report, null, 2))
  console.log(`📄 Full report written to: ${reportPath}\n`)

  // Exit with an error code if there are errors
  const errorCount = report.issues.filter(i => i.severity === 'error').length
  if (errorCount > 0) {
    console.log(`❌ Audit failed with ${errorCount} errors`)
    process.exit(1)
  } else {
    console.log('✅ Audit completed successfully')
  }
}

main().catch(error => {
  console.error('❌ Audit failed:', error)
  process.exit(1)
})
130
frontends/codegen/scripts/build-multiarch.sh
Normal file
@@ -0,0 +1,130 @@
#!/bin/bash

# Multi-Architecture Docker Build Script
# This script demonstrates how to build multi-arch images locally with QEMU

set -e

echo "🚀 Multi-Architecture Docker Build with QEMU"
echo "=============================================="
echo ""

# Colors for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
NC='\033[0m' # No Color

# Configuration
IMAGE_NAME="${1:-myapp}"
IMAGE_TAG="${2:-latest}"
PLATFORMS="${3:-linux/amd64,linux/arm64}"
REGISTRY="${4:-ghcr.io}"

echo "📋 Configuration:"
echo "  Image Name: $IMAGE_NAME"
echo "  Image Tag: $IMAGE_TAG"
echo "  Platforms: $PLATFORMS"
echo "  Registry: $REGISTRY"
echo ""

# Check if Docker is installed
if ! command -v docker &> /dev/null; then
    echo -e "${RED}❌ Docker is not installed${NC}"
    exit 1
fi

echo -e "${GREEN}✅ Docker is installed${NC}"

# Check if Docker Buildx is available
if ! docker buildx version &> /dev/null; then
    echo -e "${RED}❌ Docker Buildx is not available${NC}"
    echo "Installing Docker Buildx..."
    docker buildx install
fi

echo -e "${GREEN}✅ Docker Buildx is available${NC}"

# Set up QEMU
# Note: with `set -e`, a bare `command; if [ $? -eq 0 ]` never reaches the
# failure branch, so the command is used directly as the `if` condition.
echo ""
echo "🔧 Setting up QEMU for multi-architecture builds..."
if docker run --rm --privileged multiarch/qemu-user-static --reset -p yes; then
    echo -e "${GREEN}✅ QEMU setup successful${NC}"
else
    echo -e "${RED}❌ QEMU setup failed${NC}"
    exit 1
fi

# Create or use existing buildx builder
echo ""
echo "🔧 Setting up Docker Buildx builder..."
if docker buildx inspect multiarch &> /dev/null; then
    echo -e "${YELLOW}⚠️ Builder 'multiarch' already exists, using existing${NC}"
    docker buildx use multiarch
else
    docker buildx create --name multiarch --driver docker-container --use
    echo -e "${GREEN}✅ Created new builder 'multiarch'${NC}"
fi

docker buildx inspect --bootstrap

# Build the multi-architecture image
echo ""
echo "🏗️ Building multi-architecture Docker image..."
echo "  This may take several minutes..."
echo ""

BUILD_ARGS=""
if [ "$5" = "--push" ]; then
    BUILD_ARGS="--push"
    echo "  Will push to registry after build"
else
    BUILD_ARGS="--load"
    echo "  Will load into local Docker daemon (single platform)"
    # --load cannot import a multi-platform manifest, so build one platform only
    PLATFORMS="linux/amd64"
    echo -e "${YELLOW}⚠️ Loading locally, building only for linux/amd64${NC}"
fi

if docker buildx build \
    --platform "$PLATFORMS" \
    --tag "$REGISTRY/$IMAGE_NAME:$IMAGE_TAG" \
    $BUILD_ARGS \
    .; then
    echo ""
    echo -e "${GREEN}✅ Build successful!${NC}"
    echo ""
    echo "📦 Built images:"
    echo "  $REGISTRY/$IMAGE_NAME:$IMAGE_TAG"
    echo "  Platforms: $PLATFORMS"

    if [ "$5" = "--push" ]; then
        echo ""
        echo "🎉 Images pushed to registry!"
        echo ""
        echo "To pull the image:"
        echo "  docker pull $REGISTRY/$IMAGE_NAME:$IMAGE_TAG"
    else
        echo ""
        echo "🎉 Image loaded into local Docker!"
        echo ""
        echo "To run the image:"
        echo "  docker run -p 80:80 $REGISTRY/$IMAGE_NAME:$IMAGE_TAG"
    fi

    echo ""
    echo "To inspect the manifest:"
    echo "  docker manifest inspect $REGISTRY/$IMAGE_NAME:$IMAGE_TAG"
else
    echo ""
    echo -e "${RED}❌ Build failed${NC}"
    exit 1
fi

echo ""
echo "=============================================="
echo "✨ Build process complete!"
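The positional-parameter defaults the script relies on can be exercised on their own; the argument values below are placeholders:

```shell
# Demo of the "${N:-default}" pattern build-multiarch.sh uses for configuration
set -- myapp v1.0   # simulate: ./build-multiarch.sh myapp v1.0
IMAGE_NAME="${1:-myapp}"
IMAGE_TAG="${2:-latest}"
PLATFORMS="${3:-linux/amd64,linux/arm64}"
REGISTRY="${4:-ghcr.io}"
echo "$REGISTRY/$IMAGE_NAME:$IMAGE_TAG ($PLATFORMS)"
# prints: ghcr.io/myapp:v1.0 (linux/amd64,linux/arm64)
```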
64
frontends/codegen/scripts/cleanup-registry.ts
Normal file
@@ -0,0 +1,64 @@
#!/usr/bin/env tsx
/**
 * Cleanup script to remove obsolete wrapper references from registry
 */

import fs from 'fs'
import path from 'path'

const REGISTRY_FILE = path.resolve(process.cwd(), 'json-components-registry.json')

async function cleanupRegistry() {
  console.log('🧹 Cleaning up registry...\n')

  // Read registry
  const content = fs.readFileSync(REGISTRY_FILE, 'utf-8')
  const registry = JSON.parse(content)

  let cleanedCount = 0
  const cleanedComponents: string[] = []

  // Remove obsolete fields from all components
  if (registry.components) {
    for (const component of registry.components) {
      let modified = false

      if (component.wrapperRequired !== undefined) {
        delete component.wrapperRequired
        modified = true
      }

      if (component.wrapperComponent !== undefined) {
        delete component.wrapperComponent
        modified = true
      }

      if (modified) {
        cleanedCount++
        cleanedComponents.push(component.type || component.name || 'Unknown')
      }
    }
  }

  // Write back to file with proper formatting
  fs.writeFileSync(REGISTRY_FILE, JSON.stringify(registry, null, 2) + '\n')

  console.log(`✅ Cleaned ${cleanedCount} components\n`)

  if (cleanedComponents.length > 0) {
    console.log('📋 Cleaned components:')
    cleanedComponents.slice(0, 10).forEach(name => {
      console.log(`  • ${name}`)
    })
    if (cleanedComponents.length > 10) {
      console.log(`  ... and ${cleanedComponents.length - 10} more`)
    }
  }

  console.log('\n✨ Registry cleanup complete!')
}

cleanupRegistry().catch(error => {
  console.error('❌ Cleanup failed:', error)
  process.exit(1)
})
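The field removal performed per registry entry boils down to `delete` on plain objects; the entry below is a hypothetical example, not a real registry record:

```typescript
// Hypothetical registry entry; shows the delete-based cleanup cleanupRegistry performs
const component: Record<string, unknown> = {
  type: 'Button',
  wrapperRequired: true,
  wrapperComponent: 'ButtonWrapper',
}
delete component.wrapperRequired
delete component.wrapperComponent
console.log(JSON.stringify(component)) // {"type":"Button"}
```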
115
frontends/codegen/scripts/cleanup-simple-components.ts
Normal file
@@ -0,0 +1,115 @@
import fs from 'node:fs/promises'
import path from 'node:path'
import { fileURLToPath } from 'node:url'

const __dirname = path.dirname(fileURLToPath(import.meta.url))
const rootDir = path.resolve(__dirname, '..')

/**
 * List of simple presentational components that can be safely deleted.
 * These were identified by the conversion script as having no hooks or complex logic.
 */
const SIMPLE_COMPONENTS = {
  atoms: [
    'ActionIcon', 'Alert', 'AppLogo', 'Avatar', 'Breadcrumb', 'ButtonGroup',
    'Chip', 'Code', 'ColorSwatch', 'Container', 'DataList', 'Divider', 'Dot',
    'EmptyStateIcon', 'FileIcon', 'Flex', 'Grid', 'Heading', 'HelperText',
    'IconText', 'IconWrapper', 'InfoBox', 'InfoPanel', 'Input', 'Kbd',
    'KeyValue', 'Label', 'Link', 'List', 'ListItem', 'LiveIndicator',
    'LoadingSpinner', 'LoadingState', 'MetricDisplay', 'PageHeader', 'Pulse',
    'ResponsiveGrid', 'ScrollArea', 'SearchInput', 'Section', 'Skeleton',
    'Spacer', 'Sparkle', 'Spinner', 'StatusIcon', 'TabIcon', 'Tag', 'Text',
    'TextArea', 'TextGradient', 'TextHighlight', 'Timestamp', 'TreeIcon',
    // Additional simple ones
    'AvatarGroup', 'Checkbox', 'Drawer', 'Modal', 'Notification', 'ProgressBar',
    'Radio', 'Rating', 'Select', 'Slider', 'Stack', 'StepIndicator', 'Stepper',
    'Table', 'Tabs', 'Timeline', 'Toggle',
  ],
  molecules: [
    'ActionBar', 'AppBranding', 'DataCard', 'DataSourceCard', 'EditorActions',
    'EditorToolbar', 'EmptyEditorState', 'EmptyState', 'FileTabs', 'LabelWithBadge',
    'LazyInlineMonacoEditor', 'LazyMonacoEditor', 'LoadingFallback', 'LoadingState',
    'MonacoEditorPanel', 'NavigationItem', 'PageHeaderContent', 'SearchBar',
    'StatCard', 'TreeCard', 'TreeListHeader',
  ],
  organisms: [
    'EmptyCanvasState', 'PageHeader', 'SchemaEditorCanvas', 'SchemaEditorPropertiesPanel',
    'SchemaEditorSidebar', 'SchemaEditorStatusBar', 'SchemaEditorToolbar', 'ToolbarActions',
  ],
  ui: [
    'aspect-ratio', 'avatar', 'badge', 'checkbox', 'collapsible', 'hover-card',
    'input', 'label', 'popover', 'progress', 'radio-group', 'resizable',
    'scroll-area', 'separator', 'skeleton', 'switch', 'textarea', 'toggle',
    // Additional ones
    'accordion', 'alert', 'button', 'card', 'tabs', 'tooltip',
  ],
}

interface DeletionResult {
  deleted: string[]
  kept: string[]
  failed: string[]
}

/**
 * Delete simple TypeScript components
 */
async function deleteSimpleComponents(): Promise<void> {
  console.log('🧹 Cleaning up simple TypeScript components...\n')

  const results: DeletionResult = {
    deleted: [],
    kept: [],
    failed: [],
  }

  // Process each category
  for (const [category, components] of Object.entries(SIMPLE_COMPONENTS)) {
    console.log(`📂 Processing ${category}...`)

    const baseDir = path.join(rootDir, `src/components/${category}`)

    for (const component of components) {
      const fileName = component.endsWith('.tsx') ? component : `${component}.tsx`
      const filePath = path.join(baseDir, fileName)

      try {
        await fs.access(filePath)
        await fs.unlink(filePath)
        results.deleted.push(`${category}/${fileName}`)
        console.log(`  ✅ Deleted: ${fileName}`)
      } catch (error: unknown) {
        // File doesn't exist or couldn't be deleted
        if (error instanceof Error && 'code' in error && error.code === 'ENOENT') {
          results.kept.push(`${category}/${fileName}`)
          console.log(`  ⏭️ Skipped: ${fileName} (not found)`)
        } else {
          results.failed.push(`${category}/${fileName}`)
          console.log(`  ❌ Failed: ${fileName}`)
        }
      }
    }

    console.log()
  }

  // Summary
  console.log('📊 Summary:')
  console.log(`  Deleted: ${results.deleted.length} files`)
  console.log(`  Skipped: ${results.kept.length} files`)
  console.log(`  Failed: ${results.failed.length} files`)

  if (results.failed.length > 0) {
    console.log('\n❌ Failed deletions:')
    results.failed.forEach(f => console.log(`  - ${f}`))
  }

  console.log('\n✨ Cleanup complete!')
  console.log('\n📝 Next steps:')
  console.log('  1. Update index.ts files to remove deleted exports')
  console.log('  2. Search for direct imports of deleted components')
  console.log('  3. Run build to check for errors')
  console.log('  4. Run tests to verify functionality')
}

deleteSimpleComponents().catch(console.error)
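The ENOENT narrowing in the catch block above can be isolated into a small helper; this is a sketch (hypothetical `tryUnlink` name, run under tsx or another ESM runtime with top-level await):

```typescript
import fs from 'node:fs/promises'
import os from 'node:os'
import path from 'node:path'

// Hypothetical helper mirroring the ENOENT narrowing used by the cleanup script
async function tryUnlink(filePath: string): Promise<'deleted' | 'missing'> {
  try {
    await fs.unlink(filePath)
    return 'deleted'
  } catch (error: unknown) {
    if (error instanceof Error && 'code' in error && error.code === 'ENOENT') {
      return 'missing'
    }
    throw error // anything other than "file not found" is a real failure
  }
}

const tmp = path.join(os.tmpdir(), 'cleanup-demo.tsx')
await fs.writeFile(tmp, '')
const first = await tryUnlink(tmp)  // file exists, so it is removed
const second = await tryUnlink(tmp) // file is gone, ENOENT is swallowed
console.log(first, second) // deleted missing
```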
262
frontends/codegen/scripts/convert-tsx-to-json.ts
Normal file
@@ -0,0 +1,262 @@
import fs from 'node:fs/promises'
import path from 'node:path'
import { fileURLToPath } from 'node:url'

const __dirname = path.dirname(fileURLToPath(import.meta.url))
const rootDir = path.resolve(__dirname, '..')

interface ConversionConfig {
  sourceDir: string
  targetDir: string
  category: 'atoms' | 'molecules' | 'organisms' | 'ui'
}

interface ComponentAnalysis {
  name: string
  hasHooks: boolean
  hasComplexLogic: boolean
  wrapsUIComponent: boolean
  uiComponentName?: string
  defaultProps: Record<string, unknown>
  isSimplePresentational: boolean
}

/**
 * Analyze a TypeScript component file to determine conversion strategy
 */
async function analyzeComponent(filePath: string): Promise<ComponentAnalysis> {
  const content = await fs.readFile(filePath, 'utf-8')
  const fileName = path.basename(filePath, '.tsx')

  // Check for hooks
  const hasHooks = /use[A-Z]\w+\(/.test(content) ||
    /useState|useEffect|useCallback|useMemo|useRef|useReducer/.test(content)

  // Check for complex logic
  const hasComplexLogic = hasHooks ||
    /switch\s*\(/.test(content) ||
    /for\s*\(/.test(content) ||
    /while\s*\(/.test(content) ||
    content.split('\n').length > 100

  // Check if it wraps a shadcn/ui component
  const uiImportMatch = content.match(/import\s+\{([^}]+)\}\s+from\s+['"]@\/components\/ui\//)
  const wrapsUIComponent = !!uiImportMatch
  const uiComponentName = wrapsUIComponent ? uiImportMatch?.[1].trim() : undefined

  // Extract default props from interface
  const defaultProps: Record<string, unknown> = {}
  const propDefaults = content.matchAll(/(\w+)\s*[?]?\s*:\s*([^=\n]+)\s*=\s*['"]?([^'";\n,}]+)['"]?/g)
  for (const match of propDefaults) {
    const [, propName, , defaultValue] = match
    if (propName && defaultValue) {
      defaultProps[propName] = defaultValue.replace(/['"]/g, '')
    }
  }

  // Determine if it's simple presentational
  const isSimplePresentational = !hasComplexLogic &&
    !hasHooks &&
    content.split('\n').length < 60

  return {
    name: fileName,
    hasHooks,
    hasComplexLogic,
    wrapsUIComponent,
    uiComponentName,
    defaultProps,
    isSimplePresentational,
  }
}

/**
 * Generate JSON definition for a component based on analysis
 */
function generateJSON(analysis: ComponentAnalysis, category: string): object {
  // If it wraps a UI component, reference that
  if (analysis.wrapsUIComponent && analysis.uiComponentName) {
    return {
      type: analysis.uiComponentName,
      props: analysis.defaultProps,
    }
  }

  // If it's simple presentational, create a basic structure
  if (analysis.isSimplePresentational) {
    return {
      type: analysis.name,
      props: analysis.defaultProps,
    }
  }

  // If it has hooks or complex logic, mark as needing wrapper
  if (analysis.hasHooks || analysis.hasComplexLogic) {
    return {
      type: analysis.name,
      jsonCompatible: false,
      wrapperRequired: true,
      load: {
        path: `@/components/${category}/${analysis.name}`,
        export: analysis.name,
      },
      props: analysis.defaultProps,
      metadata: {
        notes: analysis.hasHooks ? 'Contains hooks - needs wrapper' : 'Complex logic - needs wrapper',
      },
    }
  }

  // Default case
  return {
    type: analysis.name,
    props: analysis.defaultProps,
  }
}

/**
 * Convert a single TypeScript file to JSON
 */
async function convertFile(
  sourceFile: string,
  targetDir: string,
  category: string
): Promise<{ success: boolean; analysis: ComponentAnalysis }> {
  try {
    const analysis = await analyzeComponent(sourceFile)
    const json = generateJSON(analysis, category)

    // Generate kebab-case filename
    const jsonFileName = analysis.name
      .replace(/([A-Z])/g, '-$1')
      .toLowerCase()
      .replace(/^-/, '') + '.json'

    const targetFile = path.join(targetDir, jsonFileName)

    await fs.writeFile(targetFile, JSON.stringify(json, null, 2) + '\n')

    return { success: true, analysis }
  } catch (error) {
    console.error(`Error converting ${sourceFile}:`, error)
    return {
      success: false,
      analysis: {
        name: path.basename(sourceFile, '.tsx'),
        hasHooks: false,
        hasComplexLogic: false,
        wrapsUIComponent: false,
        defaultProps: {},
        isSimplePresentational: false,
      },
    }
  }
}

/**
 * Convert all components in a directory
 */
async function convertDirectory(config: ConversionConfig): Promise<void> {
  const sourceDir = path.join(rootDir, config.sourceDir)
  const targetDir = path.join(rootDir, config.targetDir)

  console.log(`\n📂 Converting ${config.category} components...`)
  console.log(`   Source: ${sourceDir}`)
  console.log(`   Target: ${targetDir}`)

  // Ensure target directory exists
  await fs.mkdir(targetDir, { recursive: true })

  // Get all TypeScript files
  const files = await fs.readdir(sourceDir)
  const tsxFiles = files.filter(f => f.endsWith('.tsx') && !f.includes('.test.') && !f.includes('.stories.'))

  console.log(`   Found ${tsxFiles.length} TypeScript files\n`)

  const results = {
    total: 0,
    simple: 0,
    needsWrapper: 0,
    wrapsUI: 0,
    failed: 0,
  }

  // Convert each file
  for (const file of tsxFiles) {
    const sourceFile = path.join(sourceDir, file)
    const { success, analysis } = await convertFile(sourceFile, targetDir, config.category)

    results.total++

    if (!success) {
      results.failed++
      console.log(`   ❌ ${file}`)
      continue
    }

    if (analysis.wrapsUIComponent) {
      results.wrapsUI++
      console.log(`   🎨 ${file} → ${analysis.name} (wraps UI)`)
    } else if (analysis.isSimplePresentational) {
      results.simple++
      console.log(`   ✅ ${file} → ${analysis.name} (simple)`)
    } else if (analysis.hasHooks || analysis.hasComplexLogic) {
      results.needsWrapper++
      console.log(`   ⚙️ ${file} → ${analysis.name} (needs wrapper)`)
    } else {
      results.simple++
      console.log(`   ✅ ${file} → ${analysis.name}`)
    }
  }

  console.log(`\n📊 Results for ${config.category}:`)
  console.log(`   Total: ${results.total}`)
  console.log(`   Simple: ${results.simple}`)
  console.log(`   Wraps UI: ${results.wrapsUI}`)
  console.log(`   Needs Wrapper: ${results.needsWrapper}`)
  console.log(`   Failed: ${results.failed}`)
}

/**
 * Main conversion process
 */
async function main() {
  console.log('🚀 Starting TypeScript to JSON conversion...\n')

  const configs: ConversionConfig[] = [
    {
      sourceDir: 'src/components/atoms',
      targetDir: 'src/config/pages/atoms',
      category: 'atoms',
    },
    {
      sourceDir: 'src/components/molecules',
      targetDir: 'src/config/pages/molecules',
      category: 'molecules',
    },
    {
      sourceDir: 'src/components/organisms',
      targetDir: 'src/config/pages/organisms',
      category: 'organisms',
    },
    {
      sourceDir: 'src/components/ui',
      targetDir: 'src/config/pages/ui',
      category: 'ui',
    },
  ]

  for (const config of configs) {
    await convertDirectory(config)
  }

  console.log('\n✨ Conversion complete!')
  console.log('\n📝 Next steps:')
  console.log('  1. Review generated JSON files')
  console.log('  2. Manually fix complex components')
  console.log('  3. Update json-components-registry.json')
  console.log('  4. Test components render correctly')
  console.log('  5. Delete old TypeScript files')
}

main().catch(console.error)
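The kebab-case filename transform in convertFile can be extracted and checked standalone (hypothetical `toKebabFileName` helper; component names are examples):

```typescript
// Standalone version of the kebab-case filename transform used by convertFile
function toKebabFileName(name: string): string {
  return name
    .replace(/([A-Z])/g, '-$1') // prefix each capital with a hyphen
    .toLowerCase()
    .replace(/^-/, '') + '.json' // drop the leading hyphen from the first capital
}

console.log(toKebabFileName('MonacoEditorPanel')) // monaco-editor-panel.json
console.log(toKebabFileName('Input'))             // input.json
```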
91
frontends/codegen/scripts/create-missing-component-jsons.ts
Normal file
@@ -0,0 +1,91 @@
import fs from 'node:fs/promises'
import path from 'node:path'
import { fileURLToPath } from 'node:url'

const __dirname = path.dirname(fileURLToPath(import.meta.url))
const rootDir = path.resolve(__dirname, '..')

const missingComponents = [
  'AtomicLibraryShowcase',
  'CodeEditor',
  'ComponentTreeBuilder',
  'ComponentTreeManager',
  'ConflictResolutionPage',
  'DockerBuildDebugger',
  'DocumentationView',
  'ErrorPanel',
  'FaviconDesigner',
  'FeatureIdeaCloud',
  'FeatureToggleSettings',
  'JSONComponentTreeManager',
  'JSONLambdaDesigner',
  'JSONModelDesigner',
  'PersistenceDashboard',
  'PersistenceExample',
  'ProjectDashboard',
  'PWASettings',
  'SassStylesShowcase',
  'StyleDesigner',
]

async function createComponentJSON(componentName: string) {
  // Convert to kebab-case for filename
  const fileName = componentName
    .replace(/([A-Z])/g, '-$1')
    .toLowerCase()
    .replace(/^-/, '') + '.json'

  const filePath = path.join(rootDir, 'src/config/pages/components', fileName)

  // Check if component file exists
  const possiblePaths = [
    path.join(rootDir, `src/components/${componentName}.tsx`),
    path.join(rootDir, `src/components/${componentName}/index.tsx`),
  ]

  let componentPath = ''
  for (const p of possiblePaths) {
    try {
      await fs.access(p)
      componentPath = `@/components/${componentName}`
      break
    } catch {
      // Continue searching
    }
  }

  if (!componentPath) {
    console.log(`  ⚠️ ${componentName} - Component file not found, creating placeholder`)
    componentPath = `@/components/${componentName}`
  }

  const json = {
    type: componentName,
    jsonCompatible: false,
    wrapperRequired: true,
    load: {
      path: componentPath,
      export: componentName,
    },
    props: {},
  }

  await fs.writeFile(filePath, JSON.stringify(json, null, 2) + '\n')
  console.log(`  ✅ Created: ${fileName}`)
}

async function main() {
  console.log('📝 Creating JSON definitions for missing custom components...\n')

  // Ensure directory exists
  const targetDir = path.join(rootDir, 'src/config/pages/components')
  await fs.mkdir(targetDir, { recursive: true })

  for (const component of missingComponents) {
    await createComponentJSON(component)
  }

  console.log(`\n✨ Created ${missingComponents.length} component JSON files!`)
}

main().catch(console.error)
50
frontends/codegen/scripts/delete-packages-folder.sh
Normal file
@@ -0,0 +1,50 @@
#!/bin/bash

# Delete Packages Folder Script
# This script removes the packages folder after verification

set -e

echo "🗑️ Deleting packages folder..."
echo ""

# Colors for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
NC='\033[0m' # No Color

# Check if packages folder exists
if [ ! -d "packages" ]; then
    echo -e "${YELLOW}⚠${NC} packages folder does not exist"
    exit 0
fi

# Run verification first
echo "Running verification checks..."
if bash scripts/verify-packages-removal.sh; then
    echo ""
    echo -e "${GREEN}✓${NC} All verification checks passed"
    echo ""

    # Delete the folder
    echo "Deleting packages folder..."
    rm -rf packages

    if [ ! -d "packages" ]; then
        echo -e "${GREEN}✅ packages folder successfully deleted${NC}"
        echo ""
        echo "Next steps:"
        echo "  1. Test the build: npm run build"
        echo "  2. Test Docker build: docker build -t codeforge ."
        echo "  3. Commit the changes: git add -A && git commit -m 'Remove packages folder'"
    else
        echo -e "${RED}❌ Failed to delete packages folder${NC}"
        exit 1
    fi
else
    echo ""
    echo -e "${RED}❌ Verification failed${NC}"
    echo "Cannot delete packages folder - fix errors first"
    exit 1
fi
120
frontends/codegen/scripts/diagnose-502.sh
Normal file
@@ -0,0 +1,120 @@
#!/bin/bash

# 502 Error Troubleshooting Script for Codespaces
# This script helps diagnose and fix common Vite dev server issues

echo "🔍 Diagnosing 502 Bad Gateway Issues..."
echo ""

# Check if port 5000 is in use
echo "1️⃣ Checking if port 5000 is in use..."
if lsof -i :5000 >/dev/null 2>&1; then
    echo "   ✅ Port 5000 is in use (server running)"
    lsof -i :5000 | grep LISTEN
else
    echo "   ❌ Port 5000 is NOT in use (server not running)"
    echo "   → Run 'npm run dev' to start the server"
fi
echo ""

# Check if port 5173 is in use (old default)
echo "2️⃣ Checking if port 5173 is in use (old default)..."
if lsof -i :5173 >/dev/null 2>&1; then
    echo "   ⚠️ Port 5173 is in use - this is the OLD port!"
    echo "   → Kill this process and restart with updated config"
    lsof -i :5173 | grep LISTEN
else
    echo "   ✅ Port 5173 is not in use"
fi
echo ""

# Check vite.config.ts for correct port
echo "3️⃣ Checking vite.config.ts for correct port..."
if grep -q "port: 5000" vite.config.ts; then
    echo "   ✅ vite.config.ts is configured for port 5000"
else
    echo "   ❌ vite.config.ts is NOT configured for port 5000"
    echo "   → Update vite.config.ts to use port 5000"
fi
echo ""

# Check if server binds to 0.0.0.0
echo "4️⃣ Checking if server binds to 0.0.0.0..."
if grep -q "host: '0.0.0.0'" vite.config.ts || grep -q 'host: "0.0.0.0"' vite.config.ts; then
    echo "   ✅ Server configured to bind to 0.0.0.0 (externally accessible)"
else
    echo "   ❌ Server NOT configured to bind to 0.0.0.0"
    echo "   → Update vite.config.ts to include host: '0.0.0.0'"
fi
echo ""

# Check for node processes
echo "5️⃣ Checking for running node processes..."
NODE_PROCS=$(pgrep -f "node.*vite" | wc -l)
if [ "$NODE_PROCS" -gt 0 ]; then
    echo "   ✅ Found $NODE_PROCS Vite node process(es)"
    ps aux | grep "node.*vite" | grep -v grep
else
    echo "   ❌ No Vite node processes found"
    echo "   → Dev server is not running"
fi
echo ""

# Check package.json for workspace dependencies
echo "6️⃣ Checking for workspace dependencies..."
if grep -q '"@github/spark": "workspace:' package.json; then
    echo "   ℹ️ Found workspace dependencies in package.json"
    echo "   → This requires 'npm install' instead of 'npm ci'"
    echo "   → Or switch to pnpm for better workspace support"
else
    echo "   ✅ No workspace dependencies found"
fi
echo ""

# Check if dependencies are installed
echo "7️⃣ Checking if node_modules exists..."
if [ -d "node_modules" ]; then
    echo "   ✅ node_modules directory exists"
    if [ -d "node_modules/.vite" ]; then
        echo "   ✅ Vite cache exists"
    else
        echo "   ⚠️ Vite cache doesn't exist yet (first run)"
    fi
else
    echo "   ❌ node_modules directory NOT found"
    echo "   → Run 'npm install' to install dependencies"
fi
echo ""

# Summary and recommendations
echo "📋 SUMMARY & RECOMMENDATIONS"
echo "════════════════════════════════════════════════════════════"

# Determine main issue
if ! lsof -i :5000 >/dev/null 2>&1; then
    echo "❌ MAIN ISSUE: Dev server is not running on port 5000"
    echo ""
    echo "🔧 TO FIX:"
    echo "   1. Kill any existing dev servers: npm run kill"
    echo "   2. Start the dev server: npm run dev"
    echo "   3. Wait for 'ready' message with port 5000"
    echo "   4. Open the forwarded Codespaces URL"
elif lsof -i :5173 >/dev/null 2>&1; then
    echo "⚠️ MAIN ISSUE: Server running on wrong port (5173 instead of 5000)"
    echo ""
    echo "🔧 TO FIX:"
    echo "   1. Stop the current server (Ctrl+C)"
    echo "   2. Verify vite.config.ts has 'port: 5000'"
    echo "   3. Restart: npm run dev"
else
    echo "✅ Configuration looks correct!"
    echo ""
    echo "If you're still seeing 502 errors:"
    echo "   1. Check Codespaces Ports panel"
    echo "   2. Verify port 5000 is forwarded and PUBLIC"
    echo "   3. Try opening the forwarded URL again"
    echo "   4. Check browser console for detailed errors"
fi

echo ""
echo "📚 For more details, see: docs/502_ERROR_FIX.md"
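The config checks above all use the `grep -q` exit status as the `if` condition; a self-contained demo against a throwaway file (not the real vite.config.ts):

```shell
# Demo of the grep -q config checks; writes a temp file instead of touching vite.config.ts
cfg=$(mktemp)
printf "server: {\n  host: '0.0.0.0',\n  port: 5000,\n}\n" > "$cfg"
if grep -q "port: 5000" "$cfg"; then
  echo "port OK"
fi
if grep -q "host: '0.0.0.0'" "$cfg" || grep -q 'host: "0.0.0.0"' "$cfg"; then
  echo "host OK"
fi
rm -f "$cfg"
```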
141
frontends/codegen/scripts/find-component-imports.ts
Normal file
@@ -0,0 +1,141 @@
|
||||
import fs from 'node:fs/promises'
|
||||
import path from 'node:path'
|
||||
import { fileURLToPath } from 'node:url'
|
||||
|
||||
const __dirname = path.dirname(fileURLToPath(import.meta.url))
|
||||
const rootDir = path.resolve(__dirname, '..')
|
||||
|
||||
// Components we want to remove (restored dependencies)
|
||||
const targetComponents = {
|
||||
ui: ['accordion', 'alert', 'aspect-ratio', 'avatar', 'badge', 'button', 'card',
|
||||
'checkbox', 'collapsible', 'dialog', 'hover-card', 'input', 'label',
|
||||
'popover', 'progress', 'radio-group', 'resizable', 'scroll-area',
|
||||
'separator', 'skeleton', 'sheet', 'switch', 'tabs', 'textarea', 'toggle', 'tooltip'],
|
||||
molecules: ['DataSourceCard', 'EditorToolbar', 'EmptyEditorState', 'MonacoEditorPanel', 'SearchBar'],
|
||||
organisms: ['EmptyCanvasState', 'PageHeader', 'SchemaEditorCanvas', 'SchemaEditorPropertiesPanel',
|
||||
'SchemaEditorSidebar', 'SchemaEditorStatusBar', 'SchemaEditorToolbar', 'ToolbarActions'],
|
||||
atoms: ['Input']
|
||||
}
|
||||
|
||||
interface ImportInfo {
|
||||
file: string
|
||||
line: number
|
||||
importStatement: string
|
||||
importedComponents: string[]
|
||||
fromPath: string
|
||||
}
|
||||
|
||||
async function findAllImports(): Promise<ImportInfo[]> {
|
||||
const imports: ImportInfo[] = []
|
||||
|
||||
const searchDirs = [
|
    'src/components',
    'src/pages',
    'src/lib',
    'src'
  ]

  for (const dir of searchDirs) {
    const dirPath = path.join(rootDir, dir)
    try {
      await processDirectory(dirPath, imports)
    } catch (e) {
      // Directory might not exist, skip
    }
  }

  return imports
}

async function processDirectory(dir: string, imports: ImportInfo[]): Promise<void> {
  const entries = await fs.readdir(dir, { withFileTypes: true })

  for (const entry of entries) {
    const fullPath = path.join(dir, entry.name)

    if (entry.isDirectory() && !entry.name.includes('node_modules')) {
      await processDirectory(fullPath, imports)
    } else if (entry.isFile() && (entry.name.endsWith('.tsx') || entry.name.endsWith('.ts'))) {
      await processFile(fullPath, imports)
    }
  }
}

async function processFile(filePath: string, imports: ImportInfo[]): Promise<void> {
  const content = await fs.readFile(filePath, 'utf-8')
  const lines = content.split('\n')

  for (let i = 0; i < lines.length; i++) {
    const line = lines[i]

    // Check for imports from our target components
    for (const [category, components] of Object.entries(targetComponents)) {
      for (const component of components) {
        const patterns = [
          `from ['"]@/components/${category}/${component}['"]`,
          `from ['"]./${component}['"]`,
          `from ['"]../${component}['"]`,
        ]

        for (const pattern of patterns) {
          if (new RegExp(pattern).test(line)) {
            // Extract imported components
            const importMatch = line.match(/import\s+(?:\{([^}]+)\}|(\w+))\s+from/)
            const importedComponents = importMatch
              ? (importMatch[1] || importMatch[2]).split(',').map(s => s.trim())
              : []

            imports.push({
              file: filePath.replace(rootDir, '').replace(/\\/g, '/'),
              line: i + 1,
              importStatement: line.trim(),
              importedComponents,
              fromPath: component
            })
          }
        }
      }
    }
  }
}

async function main() {
  console.log('🔍 Finding all imports of target components...\n')

  const imports = await findAllImports()

  if (imports.length === 0) {
    console.log('✅ No imports found! Components can be safely deleted.')
    return
  }

  console.log(`❌ Found ${imports.length} imports that need refactoring:\n`)

  const byFile: Record<string, ImportInfo[]> = {}
  for (const imp of imports) {
    if (!byFile[imp.file]) byFile[imp.file] = []
    byFile[imp.file].push(imp)
  }

  for (const [file, fileImports] of Object.entries(byFile)) {
    console.log(`📄 ${file}`)
    for (const imp of fileImports) {
      console.log(`  Line ${imp.line}: ${imp.importStatement}`)
      console.log(`  → Imports: ${imp.importedComponents.join(', ')}`)
    }
    console.log()
  }

  console.log('\n📊 Summary by category:')
  const byCategory: Record<string, number> = {}
  for (const imp of imports) {
    const key = imp.fromPath
    byCategory[key] = (byCategory[key] || 0) + 1
  }

  for (const [component, count] of Object.entries(byCategory).sort((a, b) => b[1] - a[1])) {
    console.log(`  ${component}: ${count} imports`)
  }
}

main().catch(console.error)
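The import-extraction regex is the heart of the scanner above. A minimal standalone sketch (the import line here is a hypothetical example) shows what it captures:

```javascript
// Hypothetical input line; the regex and split/trim mirror the script above
const line = "import { Button, Card } from '@/components/ui/button'"
const importMatch = line.match(/import\s+(?:\{([^}]+)\}|(\w+))\s+from/)
// Group 1 holds the named-import list, group 2 a default import
const importedComponents = importMatch
  ? (importMatch[1] || importMatch[2]).split(',').map(s => s.trim())
  : []
console.log(importedComponents) // → [ 'Button', 'Card' ]
```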
73
frontends/codegen/scripts/fix-502.sh
Normal file
@@ -0,0 +1,73 @@
#!/bin/bash

# Quick fix script for 502 Bad Gateway errors
# This script automates the common fix steps

echo "🔧 502 Bad Gateway Quick Fix"
echo "════════════════════════════════════════════════════════════"
echo ""

# Step 1: Kill existing processes
echo "1️⃣ Killing existing processes on port 5000..."
if lsof -i :5000 >/dev/null 2>&1; then
  fuser -k 5000/tcp 2>/dev/null || true
  sleep 1
  echo " ✅ Killed processes on port 5000"
else
  echo " ℹ️ No processes on port 5000"
fi
echo ""

# Step 2: Kill processes on old port (5173)
echo "2️⃣ Killing processes on old port (5173)..."
if lsof -i :5173 >/dev/null 2>&1; then
  fuser -k 5173/tcp 2>/dev/null || true
  sleep 1
  echo " ✅ Killed processes on port 5173"
else
  echo " ℹ️ No processes on port 5173"
fi
echo ""

# Step 3: Verify vite.config.ts
echo "3️⃣ Verifying vite.config.ts..."
if grep -q "port: 5000" vite.config.ts; then
  echo " ✅ Configuration correct (port 5000)"
else
  echo " ❌ Configuration incorrect!"
  echo " → Manual fix needed: Update vite.config.ts port to 5000"
  exit 1
fi
echo ""

# Step 4: Check dependencies
echo "4️⃣ Checking dependencies..."
if [ ! -d "node_modules" ]; then
  echo " ⚠️ node_modules not found, installing dependencies..."
  npm install
  echo " ✅ Dependencies installed"
else
  echo " ✅ Dependencies present"
fi
echo ""

# Step 5: Clear Vite cache
echo "5️⃣ Clearing Vite cache..."
if [ -d "node_modules/.vite" ]; then
  rm -rf node_modules/.vite
  echo " ✅ Vite cache cleared"
else
  echo " ℹ️ No Vite cache to clear"
fi
echo ""

# Step 6: Start dev server
echo "6️⃣ Starting dev server..."
echo ""
echo "════════════════════════════════════════════════════════════"
echo "🚀 Starting Vite dev server on port 5000..."
echo " Press Ctrl+C to stop"
echo "════════════════════════════════════════════════════════════"
echo ""

npm run dev
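Step 3's grep test can be replicated in Node as a plain substring check. This is a minimal sketch; the config string is a hypothetical stand-in for the real vite.config.ts contents:

```javascript
// Hypothetical stand-in for the real vite.config.ts contents
const viteConfig = 'export default { server: { port: 5000 } }'
// Same condition the shell script checks with grep -q "port: 5000"
const ok = viteConfig.includes('port: 5000')
console.log(ok ? 'Configuration correct (port 5000)' : 'Configuration incorrect!')
```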
41
frontends/codegen/scripts/fix-index-files.ts
Normal file
@@ -0,0 +1,41 @@
#!/usr/bin/env tsx
/**
 * Fix index.ts files to only export existing TSX files
 */

import fs from 'fs'
import path from 'path'
// Note: fs.globSync is only available in Node 22+; on older Node, use the 'glob' package instead
import { globSync } from 'fs'

const ROOT_DIR = path.resolve(process.cwd())
const COMPONENTS_DIR = path.join(ROOT_DIR, 'src/components')

const categories = ['atoms', 'molecules', 'organisms']

for (const category of categories) {
  const categoryDir = path.join(COMPONENTS_DIR, category)
  const indexPath = path.join(categoryDir, 'index.ts')

  if (!fs.existsSync(indexPath)) continue

  // Find all TSX files in this category
  const tsxFiles = globSync(path.join(categoryDir, '*.tsx'))
  const basenames = tsxFiles.map(f => path.basename(f, '.tsx'))

  console.log(`\n📁 ${category}/`)
  console.log(` Found ${basenames.length} TSX files`)

  // Generate new exports
  const exports = basenames
    .sort()
    .map(name => `export { ${name} } from './${name}'`)
    .join('\n')

  // Write new index file
  const content = `// Auto-generated - only exports existing TSX files\n${exports}\n`
  fs.writeFileSync(indexPath, content)

  console.log(` ✅ Updated ${category}/index.ts`)
}

console.log('\n✨ All index files updated!')
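The export-line generation can be exercised in isolation. Here `exportLines` stands in for the script's `exports` variable and the basenames are hypothetical component files:

```javascript
// Hypothetical component files found in one category directory
const basenames = ['SearchBar', 'Button']
// Sort, then emit one re-export per component, as the script does
const exportLines = basenames
  .sort()
  .map(name => `export { ${name} } from './${name}'`)
  .join('\n')
console.log(exportLines)
// → export { Button } from './Button'
//   export { SearchBar } from './SearchBar'
```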
50
frontends/codegen/scripts/generate-json-ui-component-types.ts
Normal file
@@ -0,0 +1,50 @@
import fs from 'fs'
import path from 'path'
import { fileURLToPath } from 'url'

interface RegistryComponent {
  type?: string
  name?: string
  export?: string
}

interface RegistryData {
  components?: RegistryComponent[]
}

const __dirname = path.dirname(fileURLToPath(import.meta.url))
const rootDir = path.resolve(__dirname, '..')
const registryPath = path.join(rootDir, 'json-components-registry.json')
const outputPath = path.join(rootDir, 'src/types/json-ui-component-types.ts')

const registryData = JSON.parse(fs.readFileSync(registryPath, 'utf8')) as RegistryData
const components = registryData.components ?? []

const seen = new Set<string>()
const componentTypes = components.flatMap((component) => {
  const typeName = component.type ?? component.name ?? component.export
  if (!typeName || typeof typeName !== 'string') {
    throw new Error('Registry component is missing a valid type/name/export entry.')
  }
  if (seen.has(typeName)) {
    return []
  }
  seen.add(typeName)
  return [typeName]
})

const lines = [
  '// This file is auto-generated by scripts/generate-json-ui-component-types.ts.',
  '// Do not edit this file directly.',
  '',
  'export const jsonUIComponentTypes = [',
  ...componentTypes.map((typeName) => `  ${JSON.stringify(typeName)},`),
  '] as const',
  '',
  'export type JSONUIComponentType = typeof jsonUIComponentTypes[number]',
  '',
]

fs.writeFileSync(outputPath, `${lines.join('\n')}`)

console.log(`✅ Wrote ${componentTypes.length} component types to ${outputPath}`)
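The flatMap-plus-Set pattern above deduplicates while preserving first-seen order. A sketch with a hypothetical mini-registry:

```javascript
// Hypothetical registry entries; mirrors the dedupe logic in the script
const sample = [{ type: 'Button' }, { name: 'Card' }, { type: 'Button' }]
const seenTypes = new Set()
const types = sample.flatMap((c) => {
  const typeName = c.type ?? c.name ?? c.export
  if (seenTypes.has(typeName)) return [] // drop duplicates
  seenTypes.add(typeName)
  return [typeName]
})
console.log(types) // → [ 'Button', 'Card' ]
```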
135
frontends/codegen/scripts/generate-page.js
Normal file
@@ -0,0 +1,135 @@
#!/usr/bin/env node

/**
 * Page Generator Script
 *
 * Generates boilerplate code for adding a new page to CodeForge.
 *
 * Usage:
 *   node scripts/generate-page.js MyNewDesigner "My New Designer" "Sparkle"
 *
 * This will create:
 * - Component file
 * - JSON configuration snippet
 * - Props mapping snippet
 * - ComponentMap entry snippet
 */

const fs = require('fs')
const path = require('path')

const args = process.argv.slice(2)

if (args.length < 3) {
  console.error('Usage: node scripts/generate-page.js <ComponentName> <Title> <Icon> [toggleKey] [shortcut]')
  console.error('Example: node scripts/generate-page.js MyNewDesigner "My New Designer" "Sparkle" "myNewFeature" "ctrl+shift+n"')
  process.exit(1)
}

const [componentName, title, icon, toggleKey, shortcut] = args

const kebabCase = (str) => str.replace(/([a-z0-9])([A-Z])/g, '$1-$2').toLowerCase()
const pageId = kebabCase(componentName)

const componentTemplate = `export function ${componentName}() {
  return (
    <div className="h-full flex flex-col bg-background">
      <div className="flex-1 overflow-auto p-6">
        <div className="max-w-6xl mx-auto space-y-6">
          <div>
            <h1 className="text-3xl font-bold">${title}</h1>
            <p className="text-muted-foreground mt-2">
              Add your description here
            </p>
          </div>

          <div className="border border-border rounded-lg p-6">
            <p className="text-center text-muted-foreground">
              Start building your ${title.toLowerCase()} here
            </p>
          </div>
        </div>
      </div>
    </div>
  )
}
`

// NOTE: hardcoded next order value; bump when pages.json gains entries
const nextOrder = 21

const pageConfigSnippet = `{
  "id": "${pageId}",
  "title": "${title}",
  "icon": "${icon}",
  "component": "${componentName}",
  "enabled": true,${toggleKey ? `\n  "toggleKey": "${toggleKey}",` : ''}${shortcut ? `\n  "shortcut": "${shortcut}",` : ''}
  "order": ${nextOrder}
}`

const componentMapSnippet = `  ${componentName}: lazy(() => import('@/components/${componentName}').then(m => ({ default: m.${componentName} }))),`

const propsMapSnippet = `  '${componentName}': {
    // Add your props here
  },`

const featureToggleSnippet = toggleKey ? `  ${toggleKey}: boolean` : null

console.log('\n🎨 CodeForge Page Generator\n')
console.log('═══════════════════════════════════════\n')

console.log('📁 Component file will be created at:')
console.log(`   src/components/${componentName}.tsx\n`)

console.log('📝 Component code:')
console.log('───────────────────────────────────────')
console.log(componentTemplate)
console.log('───────────────────────────────────────\n')

console.log('⚙️ Add this to src/config/pages.json:')
console.log('───────────────────────────────────────')
console.log(pageConfigSnippet)
console.log('───────────────────────────────────────\n')

console.log('🗺️ Add this to componentMap in src/App.tsx:')
console.log('───────────────────────────────────────')
console.log(componentMapSnippet)
console.log('───────────────────────────────────────\n')

console.log('🔧 Add this to getPropsForComponent in src/App.tsx:')
console.log('───────────────────────────────────────')
console.log(propsMapSnippet)
console.log('───────────────────────────────────────\n')

if (featureToggleSnippet) {
  console.log('🎚️ Add this to FeatureToggles in src/types/project.ts:')
  console.log('───────────────────────────────────────')
  console.log(featureToggleSnippet)
  console.log('───────────────────────────────────────\n')
}

console.log('✅ Next Steps:')
console.log('   1. Create the component file')
console.log('   2. Add configuration to pages.json')
console.log('   3. Add component to componentMap')
console.log('   4. (Optional) Add props mapping')
if (featureToggleSnippet) {
  console.log('   5. Add feature toggle type and default value')
}
console.log('\n')

const componentPath = path.join(process.cwd(), 'src', 'components', `${componentName}.tsx`)

if (fs.existsSync(componentPath)) {
  console.log(`⚠️ Warning: ${componentPath} already exists. Skipping file creation.`)
} else {
  const createFile = process.argv.includes('--create')

  if (createFile) {
    fs.writeFileSync(componentPath, componentTemplate, 'utf8')
    console.log(`✅ Created ${componentPath}`)
  } else {
    console.log('💡 Run with --create flag to automatically create the component file')
  }
}

console.log('\n')
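The `kebabCase` helper drives the generated page id. A quick standalone check:

```javascript
// Same helper as in generate-page.js: insert a dash at lower/upper boundaries, then lowercase
const kebabCase = (str) => str.replace(/([a-z0-9])([A-Z])/g, '$1-$2').toLowerCase()
console.log(kebabCase('MyNewDesigner')) // → my-new-designer
```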
127
frontends/codegen/scripts/identify-pure-json-components.ts
Normal file
@@ -0,0 +1,127 @@
import fs from 'node:fs/promises'
import path from 'node:path'
import { fileURLToPath } from 'node:url'

const __dirname = path.dirname(fileURLToPath(import.meta.url))
const rootDir = path.resolve(__dirname, '..')

// Components we restored (the ones we want to potentially convert to JSON)
const restoredComponents = {
  ui: ['accordion', 'alert', 'aspect-ratio', 'avatar', 'badge', 'button', 'card',
    'checkbox', 'collapsible', 'dialog', 'hover-card', 'input', 'label',
    'popover', 'progress', 'radio-group', 'resizable', 'scroll-area',
    'separator', 'skeleton', 'sheet', 'switch', 'tabs', 'textarea', 'toggle', 'tooltip'],
  molecules: ['DataSourceCard', 'EditorToolbar', 'EmptyEditorState', 'MonacoEditorPanel', 'SearchBar'],
  organisms: ['EmptyCanvasState', 'PageHeader', 'SchemaEditorCanvas', 'SchemaEditorPropertiesPanel',
    'SchemaEditorSidebar', 'SchemaEditorStatusBar', 'SchemaEditorToolbar', 'ToolbarActions'],
  atoms: ['Input'],
}

interface ComponentAnalysis {
  name: string
  category: string
  pureJSONEligible: boolean
  reasons: string[]
  complexity: 'simple' | 'medium' | 'complex'
  hasHooks: boolean
  hasConditionalLogic: boolean
  hasHelperFunctions: boolean
  hasComplexProps: boolean
  importsCustomComponents: boolean
  onlyImportsUIorAtoms: boolean
}

async function analyzeComponent(category: string, component: string): Promise<ComponentAnalysis> {
  const tsFile = path.join(rootDir, `src/components/${category}/${component}.tsx`)
  const content = await fs.readFile(tsFile, 'utf-8')

  const hasHooks = /useState|useEffect|useCallback|useMemo|useReducer|useRef|useContext/.test(content)
  const hasConditionalLogic = /\?|if\s*\(|switch\s*\(/.test(content)
  const hasHelperFunctions = /(?:const|function)\s+\w+\s*=\s*\([^)]*\)\s*=>/.test(content) && /return\s+\(/.test(content.split('return (')[0] || '')
  const hasComplexProps = /\.\w+\s*\?/.test(content) || /Object\./.test(content) || /Array\./.test(content)

  // Check imports
  const importLines = content.match(/import\s+.*?\s+from\s+['"](.*?)['"]/g) || []
  const importsCustomComponents = importLines.some(line =>
    /@\/components\/(molecules|organisms)/.test(line)
  )
  const onlyImportsUIorAtoms = importLines.every(line => {
    if (!line.includes('@/components/')) return true
    return /@\/components\/(ui|atoms)/.test(line)
  })

  const reasons: string[] = []
  if (hasHooks) reasons.push('Has React hooks')
  if (hasHelperFunctions) reasons.push('Has helper functions')
  if (hasComplexProps) reasons.push('Has complex prop access')
  if (importsCustomComponents) reasons.push('Imports molecules/organisms')
  if (!onlyImportsUIorAtoms && !importsCustomComponents) reasons.push('Imports non-UI components')

  // Determine if eligible for pure JSON
  const pureJSONEligible = !hasHooks && !hasHelperFunctions && !hasComplexProps && onlyImportsUIorAtoms

  // Complexity scoring
  let complexity: 'simple' | 'medium' | 'complex' = 'simple'
  if (hasHooks || hasHelperFunctions || hasComplexProps) {
    complexity = 'complex'
  } else if (hasConditionalLogic || importsCustomComponents) {
    complexity = 'medium'
  }

  return {
    name: component,
    category,
    pureJSONEligible,
    reasons,
    complexity,
    hasHooks,
    hasConditionalLogic,
    hasHelperFunctions,
    hasComplexProps,
    importsCustomComponents,
    onlyImportsUIorAtoms,
  }
}

async function main() {
  console.log('🔍 Analyzing restored components for pure JSON eligibility...\n')

  const eligible: ComponentAnalysis[] = []
  const ineligible: ComponentAnalysis[] = []

  for (const [category, components] of Object.entries(restoredComponents)) {
    for (const component of components) {
      try {
        const analysis = await analyzeComponent(category, component)
        if (analysis.pureJSONEligible) {
          eligible.push(analysis)
        } else {
          ineligible.push(analysis)
        }
      } catch (e) {
        console.log(`⚠️ ${component} - Could not analyze: ${e}`)
      }
    }
  }

  console.log(`\n✅ ELIGIBLE FOR PURE JSON (${eligible.length} components)\n`)
  for (const comp of eligible) {
    console.log(` ${comp.name} (${comp.category})`)
    console.log(` Complexity: ${comp.complexity}`)
    console.log(` Conditional: ${comp.hasConditionalLogic ? 'Yes' : 'No'}`)
  }

  console.log(`\n❌ MUST STAY TYPESCRIPT (${ineligible.length} components)\n`)
  for (const comp of ineligible) {
    console.log(` ${comp.name} (${comp.category})`)
    console.log(` Complexity: ${comp.complexity}`)
    console.log(` Reasons: ${comp.reasons.join(', ')}`)
  }

  console.log(`\n📊 Summary:`)
  console.log(` Eligible for JSON: ${eligible.length}`)
  console.log(` Must stay TypeScript: ${ineligible.length}`)
  console.log(` Conversion rate: ${Math.round(eligible.length / (eligible.length + ineligible.length) * 100)}%`)
}

main().catch(console.error)
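The hook-detection heuristic above is a single regex over the file contents. Sketched standalone with hypothetical snippets:

```javascript
// Same pattern as the script's hasHooks check
const hasHooks = (src) => /useState|useEffect|useCallback|useMemo|useReducer|useRef|useContext/.test(src)
console.log(hasHooks('const [n, setN] = useState(0)')) // → true
console.log(hasHooks('<div>static</div>'))             // → false
```

Note this is a textual heuristic, so it can misfire on comments or strings that merely mention a hook name.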
252
frontends/codegen/scripts/lint-json-ui-schemas.cjs
Normal file
@@ -0,0 +1,252 @@
const fs = require('fs')
const path = require('path')

const rootDir = path.resolve(__dirname, '..')
const definitionsPath = path.join(rootDir, 'src', 'lib', 'component-definitions.json')
const schemaDirs = [
  path.join(rootDir, 'src', 'schemas'),
  path.join(rootDir, 'public', 'schemas'),
]

const commonProps = new Set(['className', 'style', 'children'])
const bindingSourceTypes = new Set(['data', 'bindings', 'state'])

const readJson = (filePath) => JSON.parse(fs.readFileSync(filePath, 'utf8'))
const fileExists = (filePath) => fs.existsSync(filePath)

const componentDefinitions = readJson(definitionsPath)
const definitionsByType = new Map(
  componentDefinitions
    .filter((definition) => definition.type)
    .map((definition) => [definition.type, definition])
)

const errors = []

const reportError = (file, pathLabel, message) => {
  errors.push({ file, path: pathLabel, message })
}

const collectSchemaFiles = (dirs) => {
  const files = []
  dirs.forEach((dir) => {
    if (!fileExists(dir)) return
    fs.readdirSync(dir).forEach((entry) => {
      if (!entry.endsWith('.json')) return
      files.push(path.join(dir, entry))
    })
  })
  return files
}

const isPageSchema = (schema) =>
  schema
  && typeof schema === 'object'
  && schema.layout
  && Array.isArray(schema.components)

const extractSchemas = (data, filePath) => {
  if (isPageSchema(data)) {
    return [{ name: filePath, schema: data }]
  }

  if (data && typeof data === 'object') {
    const schemas = Object.entries(data)
      .filter(([, value]) => isPageSchema(value))
      .map(([key, value]) => ({ name: `${filePath}:${key}`, schema: value }))
    if (schemas.length > 0) {
      return schemas
    }
  }

  return []
}

const validateBindings = (bindings, fileLabel, pathLabel, contextVars, dataSourceIds, definition) => {
  if (!bindings) return

  const propDefinitions = definition?.props
    ? new Map(definition.props.map((prop) => [prop.name, prop]))
    : null

  Object.entries(bindings).forEach(([propName, binding]) => {
    if (propDefinitions) {
      if (!propDefinitions.has(propName) && !commonProps.has(propName)) {
        reportError(fileLabel, `${pathLabel}.bindings.${propName}`, `Invalid binding for unknown prop "${propName}"`)
        return
      }

      const propDefinition = propDefinitions.get(propName)
      if (propDefinition && propDefinition.supportsBinding !== true) {
        reportError(fileLabel, `${pathLabel}.bindings.${propName}`, `Binding not supported for prop "${propName}"`)
      }
    }

    if (binding && typeof binding === 'object') {
      const sourceType = binding.sourceType ?? 'data'
      if (!bindingSourceTypes.has(sourceType)) {
        reportError(
          fileLabel,
          `${pathLabel}.bindings.${propName}.sourceType`,
          `Unsupported binding sourceType "${sourceType}"`
        )
      }

      const source = binding.source
      if (source && sourceType !== 'state') {
        const isKnownSource = dataSourceIds.has(source) || contextVars.has(source)
        if (!isKnownSource) {
          reportError(
            fileLabel,
            `${pathLabel}.bindings.${propName}.source`,
            `Binding source "${source}" is not defined in dataSources or loop context`
          )
        }
      }
    }
  })
}

const validateDataBinding = (dataBinding, fileLabel, pathLabel, contextVars, dataSourceIds) => {
  if (!dataBinding || typeof dataBinding !== 'object') return

  const sourceType = dataBinding.sourceType ?? 'data'
  if (!bindingSourceTypes.has(sourceType)) {
    reportError(
      fileLabel,
      `${pathLabel}.dataBinding.sourceType`,
      `Unsupported dataBinding sourceType "${sourceType}"`
    )
  }

  if (dataBinding.source && sourceType !== 'state') {
    const isKnownSource = dataSourceIds.has(dataBinding.source) || contextVars.has(dataBinding.source)
    if (!isKnownSource) {
      reportError(
        fileLabel,
        `${pathLabel}.dataBinding.source`,
        `Data binding source "${dataBinding.source}" is not defined in dataSources or loop context`
      )
    }
  }
}

const validateRequiredProps = (component, fileLabel, pathLabel, definition, bindings) => {
  if (!definition?.props) return

  definition.props.forEach((prop) => {
    if (!prop.required) return

    const hasProp = component.props && Object.prototype.hasOwnProperty.call(component.props, prop.name)
    const hasBinding = bindings && Object.prototype.hasOwnProperty.call(bindings, prop.name)

    if (!hasProp && (!prop.supportsBinding || !hasBinding)) {
      reportError(
        fileLabel,
        `${pathLabel}.props.${prop.name}`,
        `Missing required prop "${prop.name}" for component type "${component.type}"`
      )
    }
  })
}

const validateProps = (component, fileLabel, pathLabel, definition) => {
  if (!component.props || !definition?.props) return

  const allowedProps = new Set(definition.props.map((prop) => prop.name))
  commonProps.forEach((prop) => allowedProps.add(prop))

  Object.keys(component.props).forEach((propName) => {
    if (!allowedProps.has(propName)) {
      reportError(
        fileLabel,
        `${pathLabel}.props.${propName}`,
        `Invalid prop "${propName}" for component type "${component.type}"`
      )
    }
  })
}

const lintComponent = (component, fileLabel, pathLabel, contextVars, dataSourceIds) => {
  if (!component || typeof component !== 'object') return

  if (!component.id) {
    reportError(fileLabel, pathLabel, 'Missing required component id')
  }

  if (!component.type) {
    reportError(fileLabel, pathLabel, 'Missing required component type')
    return
  }

  const definition = definitionsByType.get(component.type)

  validateProps(component, fileLabel, pathLabel, definition)
  validateRequiredProps(component, fileLabel, pathLabel, definition, component.bindings)
  validateBindings(component.bindings, fileLabel, pathLabel, contextVars, dataSourceIds, definition)
  validateDataBinding(component.dataBinding, fileLabel, pathLabel, contextVars, dataSourceIds)

  const nextContextVars = new Set(contextVars)
  const repeatConfig = component.loop ?? component.repeat
  if (repeatConfig) {
    if (repeatConfig.itemVar) {
      nextContextVars.add(repeatConfig.itemVar)
    }
    if (repeatConfig.indexVar) {
      nextContextVars.add(repeatConfig.indexVar)
    }
  }

  if (Array.isArray(component.children)) {
    component.children.forEach((child, index) => {
      if (typeof child === 'string') return
      lintComponent(child, fileLabel, `${pathLabel}.children[${index}]`, nextContextVars, dataSourceIds)
    })
  }

  if (component.conditional) {
    const branches = [component.conditional.then, component.conditional.else]
    branches.forEach((branch, branchIndex) => {
      if (!branch) return
      if (typeof branch === 'string') return
      if (Array.isArray(branch)) {
        branch.forEach((child, index) => {
          if (typeof child === 'string') return
          lintComponent(child, fileLabel, `${pathLabel}.conditional.${branchIndex}[${index}]`, nextContextVars, dataSourceIds)
        })
      } else {
        lintComponent(branch, fileLabel, `${pathLabel}.conditional.${branchIndex}`, nextContextVars, dataSourceIds)
      }
    })
  }
}

const lintSchema = (schema, fileLabel) => {
  const dataSourceIds = new Set(
    Array.isArray(schema.dataSources)
      ? schema.dataSources.map((source) => source.id).filter(Boolean)
      : []
  )

  schema.components.forEach((component, index) => {
    lintComponent(component, fileLabel, `components[${index}]`, new Set(), dataSourceIds)
  })
}

const schemaFiles = collectSchemaFiles(schemaDirs)

schemaFiles.forEach((filePath) => {
  const data = readJson(filePath)
  const schemas = extractSchemas(data, filePath)
  schemas.forEach(({ name, schema }) => lintSchema(schema, name))
})

if (errors.length > 0) {
  console.error('JSON UI lint errors found:')
  errors.forEach((error) => {
    console.error(`- ${error.file} :: ${error.path} :: ${error.message}`)
  })
  process.exit(1)
}

console.log('JSON UI lint passed.')
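The `isPageSchema` guard decides which JSON objects get linted. A standalone sketch (wrapped in `Boolean` here for a clean true/false, unlike the truthy chain in the script):

```javascript
// Mirrors the script's guard: an object with a layout and a components array
const isPageSchema = (schema) =>
  Boolean(schema && typeof schema === 'object' && schema.layout && Array.isArray(schema.components))
console.log(isPageSchema({ layout: 'grid', components: [] })) // → true
console.log(isPageSchema({ components: [] }))                 // → false
```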
139
frontends/codegen/scripts/list-json-components.cjs
Executable file
@@ -0,0 +1,139 @@
#!/usr/bin/env node

/**
 * JSON Components List Script
 *
 * Lists all components that can be rendered from JSON using the JSON UI system.
 *
 * Usage:
 *   node scripts/list-json-components.cjs [--format=table|json] [--status=all|supported|planned]
 */

const fs = require('fs')
const path = require('path')

// Read the JSON components registry
const registryPath = path.join(process.cwd(), 'json-components-registry.json')

if (!fs.existsSync(registryPath)) {
  console.error('❌ Could not find json-components-registry.json')
  process.exit(1)
}

let registry
try {
  registry = JSON.parse(fs.readFileSync(registryPath, 'utf8'))
} catch (e) {
  console.error('❌ Failed to parse json-components-registry.json:', e.message)
  process.exit(1)
}

const format = process.argv.find(arg => arg.startsWith('--format='))?.split('=')[1] || 'table'
const statusFilter = process.argv.find(arg => arg.startsWith('--status='))?.split('=')[1] || 'all'

// Filter components by status if requested
let componentsList = registry.components
if (statusFilter !== 'all') {
  componentsList = componentsList.filter(c => c.status === statusFilter)
}

if (format === 'json') {
  console.log(JSON.stringify(componentsList, null, 2))
  process.exit(0)
}

// Table format output
console.log('\n🧩 JSON-Compatible Components\n')
console.log('═══════════════════════════════════════════════════════════════════════════\n')
console.log(`These components can be rendered from JSON schemas using the JSON UI system.`)
if (statusFilter !== 'all') {
  console.log(`\nFiltered by status: ${statusFilter}`)
}
console.log()

// Group by category
const categories = ['layout', 'input', 'display', 'navigation', 'feedback', 'data', 'custom']
const categoryIcons = {
  layout: '📐',
  input: '⌨️ ',
  display: '🎨',
  navigation: '🧭',
  feedback: '💬',
  data: '📊',
  custom: '⚡'
}

categories.forEach(category => {
  const categoryComps = componentsList.filter(c => c.category === category)

  if (categoryComps.length === 0) return

  console.log(`\n${categoryIcons[category]} ${category.toUpperCase()}\n`)
  console.log('───────────────────────────────────────────────────────────────────────────')

  categoryComps.forEach(comp => {
    const children = comp.canHaveChildren ? '👶 Can have children' : '➖ No children'
    let statusIcon = comp.status === 'supported' ? '✅' : '📋'
    if (comp.status === 'json-compatible') statusIcon = '🔥'
    if (comp.status === 'maybe-json-compatible') statusIcon = '⚠️ '

    const source = comp.source ? ` [${comp.source}]` : ''

    console.log(` ${statusIcon} ${comp.name} (${comp.type})${source}`)
    console.log(` ${comp.description}`)
    console.log(` ${children}`)
    if (comp.subComponents) {
      console.log(` Sub-components: ${comp.subComponents.join(', ')}`)
    }
    if (comp.jsonCompatible !== undefined && !comp.jsonCompatible) {
      console.log(` ⚠️ Not JSON-powered (${comp.jsonReason || 'complex state/logic'})`)
    }
    console.log('')
  })
})

console.log('═══════════════════════════════════════════════════════════════════════════')
console.log(`\nTotal Components: ${componentsList.length}`)

if (statusFilter === 'all') {
  const supported = componentsList.filter(c => c.status === 'supported').length
  const planned = componentsList.filter(c => c.status === 'planned').length
  const jsonCompatible = componentsList.filter(c => c.status === 'json-compatible').length
  const maybeCompatible = componentsList.filter(c => c.status === 'maybe-json-compatible').length
  const atoms = componentsList.filter(c => c.source === 'atoms').length
  const molecules = componentsList.filter(c => c.source === 'molecules').length
  const organisms = componentsList.filter(c => c.source === 'organisms').length
  const ui = componentsList.filter(c => c.source === 'ui').length

  console.log(`\nBy Status:`)
  console.log(` ✅ Supported: ${supported}`)
  console.log(` 🔥 JSON-Compatible: ${jsonCompatible}`)
  console.log(` ⚠️ Maybe JSON-Compatible: ${maybeCompatible}`)
  console.log(` 📋 Planned: ${planned}`)

  console.log(`\nBy Source:`)
  if (atoms > 0) console.log(` 🧱 Atoms: ${atoms}`)
  if (molecules > 0) console.log(` 🧪 Molecules: ${molecules}`)
  if (organisms > 0) console.log(` 🦠 Organisms: ${organisms}`)
  if (ui > 0) console.log(` 🎨 UI: ${ui}`)
}

console.log(`\nBy Category:`)
categories.forEach(cat => {
  const count = componentsList.filter(c => c.category === cat).length
  if (count > 0) {
    console.log(` ${categoryIcons[cat]} ${cat}: ${count}`)
  }
})

console.log(`\nComponents with children support: ${componentsList.filter(c => c.canHaveChildren).length}`)

console.log('\n💡 Tips:')
console.log(' • Full registry in json-components-registry.json')
console.log(' • Component types defined in src/types/json-ui.ts')
console.log(' • Component registry in src/lib/json-ui/component-registry.tsx')
console.log(' • Component definitions in src/lib/component-definitions.ts')
console.log(' • Run with --format=json for JSON output')
console.log(' • Run with --status=supported to see only supported components')
console.log(' • Run with --status=planned to see only planned components')
console.log('')
87
frontends/codegen/scripts/list-pages.js
Normal file
@@ -0,0 +1,87 @@
#!/usr/bin/env node

/**
 * Page List Script
 *
 * Lists all pages defined in pages.json with their configuration.
 *
 * Usage:
 *   node scripts/list-pages.js [--format=table|json]
 */

const fs = require('fs')
const path = require('path')

const pagesJsonPath = path.join(process.cwd(), 'src', 'config', 'pages.json')

if (!fs.existsSync(pagesJsonPath)) {
  console.error('❌ Could not find src/config/pages.json')
  process.exit(1)
}

const pagesConfig = JSON.parse(fs.readFileSync(pagesJsonPath, 'utf8'))
const format = process.argv.find(arg => arg.startsWith('--format='))?.split('=')[1] || 'table'

if (format === 'json') {
  console.log(JSON.stringify(pagesConfig.pages, null, 2))
  process.exit(0)
}

console.log('\n📋 CodeForge Pages Configuration\n')
console.log('═══════════════════════════════════════════════════════════════════════════\n')

const sortedPages = [...pagesConfig.pages].sort((a, b) => a.order - b.order)

sortedPages.forEach((page, index) => {
  const enabled = page.enabled ? '✅' : '❌'
  const hasToggle = page.toggleKey ? `🎚️ ${page.toggleKey}` : '➖'
  const hasShortcut = page.shortcut ? `⌨️ ${page.shortcut}` : '➖'

  console.log(`${String(index + 1).padStart(2, '0')}. ${page.title}`)
  console.log(` ID: ${page.id}`)
  console.log(` Component: ${page.component}`)
  console.log(` Icon: ${page.icon}`)
  console.log(` Enabled: ${enabled}`)
  console.log(` Toggle: ${hasToggle}`)
  console.log(` Shortcut: ${hasShortcut}`)
  console.log(` Order: ${page.order}`)
  if (page.requiresResizable) {
    console.log(` Layout: Resizable Split-Pane`)
  }
  console.log('')
})

console.log('═══════════════════════════════════════════════════════════════════════════')
console.log(`\nTotal Pages: ${pagesConfig.pages.length}`)
console.log(`Enabled: ${pagesConfig.pages.filter(p => p.enabled).length}`)
console.log(`With Shortcuts: ${pagesConfig.pages.filter(p => p.shortcut).length}`)
console.log(`With Feature Toggles: ${pagesConfig.pages.filter(p => p.toggleKey).length}`)
console.log('')

const shortcuts = sortedPages
  .filter(p => p.shortcut && p.enabled)
  .map(p => ` ${p.shortcut.padEnd(12)} → ${p.title}`)

if (shortcuts.length > 0) {
  console.log('\n⌨️ Keyboard Shortcuts\n')
  console.log('───────────────────────────────────────────────────────────────────────────')
  shortcuts.forEach(s => console.log(s))
  console.log('')
}

const featureToggles = sortedPages
  .filter(p => p.toggleKey && p.enabled)
  .map(p => ` ${p.toggleKey.padEnd(20)} → ${p.title}`)

if (featureToggles.length > 0) {
  console.log('\n🎚️ Feature Toggles\n')
  console.log('───────────────────────────────────────────────────────────────────────────')
  featureToggles.forEach(t => console.log(t))
  console.log('')
}

console.log('\n💡 Tips:')
console.log(' • Edit src/config/pages.json to add/modify pages')
console.log(' • Run with --format=json for JSON output')
console.log(' • See DECLARATIVE_SYSTEM.md for full documentation')
console.log('')
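For reference, a minimal sketch of the `pages.json` shape that `list-pages.js` reads, with the field names taken from the script itself; the concrete values (`editor`, `SchemaEditorLayout`, etc.) are hypothetical examples, not entries from the real config:

```javascript
// Hypothetical pages.json content covering the fields list-pages.js reads:
// id, title, component, icon, enabled, order, plus the optional
// toggleKey, shortcut, and requiresResizable flags.
const pagesConfig = {
  pages: [
    {
      id: 'editor',
      title: 'Schema Editor',
      component: 'SchemaEditorLayout',
      icon: 'Pencil',
      enabled: true,
      order: 1,
      shortcut: 'Ctrl+1',
      toggleKey: 'schemaEditor',
      requiresResizable: true,
    },
  ],
}

// The same sort the script applies before printing the table
const sorted = [...pagesConfig.pages].sort((a, b) => a.order - b.order)
console.log(sorted[0].title) // → Schema Editor
```

Any field left out (e.g. `shortcut`) simply renders as `➖` in the table output.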
157
frontends/codegen/scripts/refactor-to-dynamic-imports.ts
Normal file
@@ -0,0 +1,157 @@
import fs from 'node:fs/promises'
import path from 'node:path'
import { fileURLToPath } from 'node:url'

const __dirname = path.dirname(fileURLToPath(import.meta.url))
const rootDir = path.resolve(__dirname, '..')

/**
 * Strategy: Replace static imports with dynamic component loading
 *
 * Before:
 *   import { Button } from '@/components/ui/button'
 *   <Button variant="primary">Click</Button>
 *
 * After:
 *   import { getComponent } from '@/lib/component-loader'
 *   const Button = getComponent('Button')
 *   <Button variant="primary">Click</Button>
 */

interface RefactorTask {
  file: string
  replacements: Array<{
    oldImport: string
    newImport: string
    components: string[]
  }>
}

const targetComponents = {
  ui: ['button', 'card', 'badge', 'label', 'input', 'separator', 'scroll-area',
    'tabs', 'dialog', 'textarea', 'tooltip', 'switch', 'alert', 'skeleton',
    'progress', 'collapsible', 'resizable', 'popover', 'hover-card', 'checkbox',
    'accordion', 'aspect-ratio', 'avatar', 'radio-group', 'sheet', 'toggle'],
  molecules: ['DataSourceCard', 'EditorToolbar', 'EmptyEditorState', 'MonacoEditorPanel', 'SearchBar'],
  organisms: ['EmptyCanvasState', 'PageHeader', 'SchemaEditorCanvas', 'SchemaEditorPropertiesPanel',
    'SchemaEditorSidebar', 'SchemaEditorStatusBar', 'SchemaEditorToolbar', 'ToolbarActions'],
  atoms: ['Input']
}

export async function refactorFile(filePath: string): Promise<boolean> {
  let content = await fs.readFile(filePath, 'utf-8')
  let modified = false

  // Find all imports to replace
  const componentsToLoad = new Set<string>()

  for (const [category, components] of Object.entries(targetComponents)) {
    for (const component of components) {
      const patterns = [
        new RegExp(`import\\s+\\{([^}]+)\\}\\s+from\\s+['"]@/components/${category}/${component}['"]`, 'g'),
        new RegExp(`import\\s+(\\w+)\\s+from\\s+['"]@/components/${category}/${component}['"]`, 'g'),
      ]

      for (const pattern of patterns) {
        const matches = content.matchAll(pattern)
        for (const match of matches) {
          const importedItems = match[1].split(',').map(s => s.trim().split(' as ')[0].trim())
          importedItems.forEach(item => componentsToLoad.add(item))

          // Remove the import line
          content = content.replace(match[0], '')
          modified = true
        }
      }
    }
  }

  if (!modified) return false

  // Add dynamic component loader import at top
  const loaderImport = `import { loadComponent } from '@/lib/component-loader'\n`

  // Add component loading statements
  const componentLoads = Array.from(componentsToLoad)
    .map(comp => `const ${comp} = loadComponent('${comp}')`)
    .join('\n')

  // Find first import statement location
  const firstImportMatch = content.match(/^import\s/m)
  if (firstImportMatch && firstImportMatch.index !== undefined) {
    content = content.slice(0, firstImportMatch.index) +
      loaderImport + '\n' +
      componentLoads + '\n\n' +
      content.slice(firstImportMatch.index)
  }

  await fs.writeFile(filePath, content)
  return true
}

async function createComponentLoader() {
  const loaderPath = path.join(rootDir, 'src/lib/component-loader.ts')

  const loaderContent = `/**
 * Dynamic Component Loader
 * Loads components from the registry at runtime instead of static imports
 */

import { ComponentType, lazy } from 'react'

const componentCache = new Map<string, ComponentType<any>>()

export function loadComponent(componentName: string): ComponentType<any> {
  if (componentCache.has(componentName)) {
    return componentCache.get(componentName)!
  }

  // Try to load from different sources
  const loaders = [
    () => import(\`@/components/ui/\${componentName.toLowerCase()}\`),
    () => import(\`@/components/atoms/\${componentName}\`),
    () => import(\`@/components/molecules/\${componentName}\`),
    () => import(\`@/components/organisms/\${componentName}\`),
  ]

  // Create lazy component
  const LazyComponent = lazy(async () => {
    for (const loader of loaders) {
      try {
        const module = await loader()
        return { default: module[componentName] || module.default }
      } catch (e) {
        continue
      }
    }
    throw new Error(\`Component \${componentName} not found\`)
  })

  componentCache.set(componentName, LazyComponent)
  return LazyComponent
}

export function getComponent(componentName: string): ComponentType<any> {
  return loadComponent(componentName)
}
`

  await fs.writeFile(loaderPath, loaderContent)
  console.log('✅ Created component-loader.ts')
}

async function main() {
  console.log('🚀 Starting AGGRESSIVE refactoring to eliminate static imports...\n')
  console.log('⚠️ WARNING: This is a MAJOR refactoring affecting 975+ import statements!\n')
  console.log('Press Ctrl+C now if you want to reconsider...\n')

  await new Promise(resolve => setTimeout(resolve, 3000))

  console.log('🔧 Creating dynamic component loader...')
  await createComponentLoader()

  console.log('\n📝 This approach requires significant testing and may break things.')
  console.log('   Recommendation: Manual refactoring of high-value components instead.\n')
}

main().catch(console.error)
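As a quick sanity check of the import-matching step in `refactorFile`, the named-import regex and its alias-stripping post-processing can be exercised in isolation. The `source` string below is an invented sample; the regex and the `split(' as ')` handling mirror the script:

```javascript
// Exercise the named-import pattern used by refactorFile on a made-up
// source line; 'ui'/'button' are example category/component values.
const category = 'ui'
const component = 'button'
const pattern = new RegExp(
  `import\\s+\\{([^}]+)\\}\\s+from\\s+['"]@/components/${category}/${component}['"]`,
  'g'
)

const source = `import { Button, buttonVariants as variants } from '@/components/ui/button'`
const match = pattern.exec(source)

// Same post-processing as the script: trim each name and drop aliases
const imported = match[1].split(',').map(s => s.trim().split(' as ')[0].trim())
console.log(imported) // → [ 'Button', 'buttonVariants' ]
```

Note that aliased imports are tracked under their original export name, so a file relying on the `variants` alias would still need a manual fix after the rewrite.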
368
frontends/codegen/scripts/scan-and-update-registry.cjs
Normal file
@@ -0,0 +1,368 @@
#!/usr/bin/env node

/**
 * Scan and Update JSON Components Registry
 *
 * Scans the actual component files in src/components and updates
 * json-components-registry.json to include all real components.
 *
 * Usage:
 *   node scripts/scan-and-update-registry.cjs
 */

const fs = require('fs')
const path = require('path')

// Scan a directory for .tsx files
function scanComponents(dir) {
  const files = fs.readdirSync(dir)
  return files
    .filter(f => f.endsWith('.tsx') && !f.startsWith('index'))
    .map(f => f.replace('.tsx', ''))
}

// Get all components
const atomsPath = path.join(process.cwd(), 'src/components/atoms')
const moleculesPath = path.join(process.cwd(), 'src/components/molecules')
const organismsPath = path.join(process.cwd(), 'src/components/organisms')
const uiPath = path.join(process.cwd(), 'src/components/ui')

const atoms = scanComponents(atomsPath)
const molecules = scanComponents(moleculesPath)
const organisms = fs.existsSync(organismsPath) ? scanComponents(organismsPath) : []
const ui = scanComponents(uiPath)

console.log(`Found ${atoms.length} atoms, ${molecules.length} molecules, ${organisms.length} organisms, ${ui.length} ui components`)
console.log(`Total: ${atoms.length + molecules.length + organisms.length + ui.length} components`)

// Read existing registry to preserve metadata
const registryPath = path.join(process.cwd(), 'json-components-registry.json')
let existingRegistry = { components: [] }
if (fs.existsSync(registryPath)) {
  existingRegistry = JSON.parse(fs.readFileSync(registryPath, 'utf8'))
}

// Create a map of existing components for quick lookup
const existingMap = new Map()
existingRegistry.components.forEach(c => {
  existingMap.set(c.type, c)
})

// Category mapping heuristics
function guessCategory(name) {
  const lower = name.toLowerCase()

  // Layout
  if (lower.includes('container') || lower.includes('grid') || lower.includes('flex') ||
      lower.includes('stack') || lower.includes('card') || lower.includes('section') ||
      lower.includes('drawer') || lower.includes('modal') || lower.includes('dialog')) {
    return 'layout'
  }

  // Input
  if (lower.includes('input') || lower.includes('button') || lower.includes('select') ||
      lower.includes('checkbox') || lower.includes('radio') || lower.includes('switch') ||
      lower.includes('slider') || lower.includes('form') || lower.includes('upload') ||
      lower.includes('picker') || lower.includes('toggle')) {
    return 'input'
  }

  // Navigation
  if (lower.includes('link') || lower.includes('breadcrumb') || lower.includes('tab') ||
      lower.includes('menu') || lower.includes('navigation')) {
    return 'navigation'
  }

  // Feedback
  if (lower.includes('alert') || lower.includes('notification') || lower.includes('badge') ||
      lower.includes('status') || lower.includes('error') || lower.includes('empty') ||
      lower.includes('loading') || lower.includes('spinner') || lower.includes('toast')) {
    return 'feedback'
  }

  // Data
  if (lower.includes('table') || lower.includes('list') || lower.includes('data') ||
      lower.includes('metric') || lower.includes('stat') || lower.includes('chart') ||
      lower.includes('timeline') || lower.includes('keyvalue')) {
    return 'data'
  }

  // Display (default for text, images, icons, etc.)
  if (lower.includes('text') || lower.includes('heading') || lower.includes('label') ||
      lower.includes('image') || lower.includes('avatar') || lower.includes('icon') ||
      lower.includes('code') || lower.includes('tag') || lower.includes('skeleton') ||
      lower.includes('separator') || lower.includes('divider') || lower.includes('progress')) {
    return 'display'
  }

  return 'custom'
}

function canHaveChildren(name) {
  const noChildren = [
    'Input', 'TextArea', 'Select', 'Checkbox', 'Radio', 'Switch', 'Slider', 'NumberInput',
    'Image', 'Avatar', 'Separator', 'Divider', 'Progress', 'ProgressBar', 'Skeleton',
    'Spinner', 'Icon', 'FileUpload', 'DatePicker', 'CircularProgress', 'StatusIcon',
    'StatusBadge', 'ErrorBadge', 'Table', 'DataTable', 'List', 'DataList', 'KeyValue',
    'StatCard', 'MetricCard', 'DataCard', 'SearchInput', 'ActionBar', 'Timeline'
  ]
  return !noChildren.includes(name)
}

function getDescription(name) {
  // Try to generate a reasonable description
  const descriptions = {
    // Common patterns
    'Accordion': 'Collapsible content sections',
    'ActionButton': 'Button with action icon',
    'ActionBar': 'Action button toolbar',
    'Alert': 'Alert notification message',
    'Avatar': 'User avatar image',
    'AvatarGroup': 'Group of user avatars',
    'Badge': 'Small status or count indicator',
    'Breadcrumb': 'Navigation breadcrumb trail',
    'Button': 'Interactive button element',
    'ButtonGroup': 'Group of related buttons',
    'Calendar': 'Calendar date selector',
    'Card': 'Container card component',
    'Checkbox': 'Checkbox toggle control',
    'Chip': 'Compact element for tags or selections',
    'CircularProgress': 'Circular progress indicator',
    'Code': 'Inline or block code display',
    'CommandPalette': 'Command search and execution',
    'Container': 'Generic container element',
    'ContextMenu': 'Right-click context menu',
    'DataCard': 'Custom data display card',
    'DataList': 'Styled data list',
    'DataTable': 'Advanced data table with sorting and filtering',
    'DatePicker': 'Date selection input',
    'Divider': 'Visual section divider',
    'Drawer': 'Sliding panel overlay',
    'EmptyState': 'Empty state placeholder',
    'ErrorBadge': 'Error state badge',
    'FileUpload': 'File upload control',
    'Flex': 'Flexible box layout container',
    'Form': 'Form container component',
    'Grid': 'Responsive grid layout',
    'Heading': 'Heading text with level (h1-h6)',
    'HoverCard': 'Card shown on hover',
    'Icon': 'Icon from icon library',
    'IconButton': 'Button with icon only',
    'Image': 'Image element with loading states',
    'InfoBox': 'Information box with icon',
    'Input': 'Text input field',
    'Kbd': 'Keyboard key display',
    'KeyValue': 'Key-value pair display',
    'Label': 'Form label element',
    'Link': 'Hyperlink element',
    'List': 'Generic list renderer with custom items',
    'Menu': 'Menu component',
    'MetricCard': 'Metric display card',
    'Modal': 'Modal dialog overlay',
    'Notification': 'Toast notification',
    'NumberInput': 'Numeric input with increment/decrement',
    'PasswordInput': 'Password input with visibility toggle',
    'Popover': 'Popover overlay content',
    'Progress': 'Progress bar indicator',
    'ProgressBar': 'Linear progress bar',
    'Radio': 'Radio button selection',
    'Rating': 'Star rating component',
    'ScrollArea': 'Scrollable container area',
    'SearchInput': 'Search input with icon',
    'Select': 'Dropdown select control',
    'Separator': 'Visual divider line',
    'Skeleton': 'Loading skeleton placeholder',
    'Slider': 'Numeric range slider',
    'Spinner': 'Loading spinner',
    'Stack': 'Vertical or horizontal stack layout',
    'StatCard': 'Statistic card display',
    'StatusBadge': 'Status indicator badge',
    'StatusIcon': 'Status indicator icon',
    'Stepper': 'Step-by-step navigation',
    'Switch': 'Toggle switch control',
    'Table': 'Data table',
    'Tabs': 'Tabbed interface container',
    'Tag': 'Removable tag or chip',
    'Text': 'Text content with typography variants',
    'TextArea': 'Multi-line text input',
    'Timeline': 'Timeline visualization',
    'Toggle': 'Toggle button control',
    'Tooltip': 'Tooltip overlay text'
  }

  return descriptions[name] || `${name} component`
}

// JSON compatibility lists based on analysis
const jsonCompatibleMolecules = [
  'AppBranding', 'Breadcrumb', 'EmptyEditorState', 'LabelWithBadge',
  'LazyBarChart', 'LazyD3BarChart', 'LazyLineChart', 'LoadingFallback',
  'LoadingState', 'NavigationGroupHeader', 'SaveIndicator',
  'SeedDataManager', 'StorageSettings'
]

const maybeJsonCompatibleMolecules = [
  'ActionBar', 'BindingEditor', 'CanvasRenderer', 'CodeExplanationDialog',
  'ComponentBindingDialog', 'ComponentPalette', 'ComponentTree', 'DataCard',
  'DataSourceCard', 'DataSourceEditorDialog', 'EditorActions', 'EditorToolbar',
  'EmptyState', 'FileTabs', 'LazyInlineMonacoEditor', 'LazyMonacoEditor',
  'MonacoEditorPanel', 'NavigationItem', 'PageHeaderContent', 'PropertyEditor',
  'SearchBar', 'SearchInput', 'StatCard', 'ToolbarButton', 'TreeCard',
  'TreeFormDialog', 'TreeListHeader'
]

const jsonCompatibleOrganisms = ['PageHeader']

const maybeJsonCompatibleOrganisms = [
  'AppHeader', 'DataSourceManager', 'EmptyCanvasState', 'JSONUIShowcase',
  'NavigationMenu', 'SchemaCodeViewer', 'SchemaEditorCanvas',
  'SchemaEditorLayout', 'SchemaEditorPropertiesPanel', 'SchemaEditorSidebar',
  'SchemaEditorStatusBar', 'SchemaEditorToolbar', 'ToolbarActions', 'TreeListPanel'
]

// Build components array
const components = []

// Process atoms (all are foundational, mark as supported)
atoms.forEach(name => {
  const existing = existingMap.get(name)
  components.push({
    type: name,
    name: existing?.name || name,
    category: existing?.category || guessCategory(name),
    canHaveChildren: existing?.canHaveChildren !== undefined ? existing.canHaveChildren : canHaveChildren(name),
    description: existing?.description || getDescription(name),
    status: existing?.status || 'supported',
    source: 'atoms'
  })
})

// Process molecules with JSON compatibility marking
molecules.forEach(name => {
  const existing = existingMap.get(name)
  let status = existing?.status || 'supported'

  if (jsonCompatibleMolecules.includes(name)) {
    status = 'json-compatible'
  } else if (maybeJsonCompatibleMolecules.includes(name)) {
    status = 'maybe-json-compatible'
  }

  components.push({
    type: name,
    name: existing?.name || name,
    category: existing?.category || guessCategory(name),
    canHaveChildren: existing?.canHaveChildren !== undefined ? existing.canHaveChildren : canHaveChildren(name),
    description: existing?.description || getDescription(name),
    status,
    source: 'molecules',
    jsonCompatible: jsonCompatibleMolecules.includes(name) || maybeJsonCompatibleMolecules.includes(name)
  })
})

// Process organisms with JSON compatibility marking
organisms.forEach(name => {
  const existing = existingMap.get(name)
  let status = existing?.status || 'supported'

  if (jsonCompatibleOrganisms.includes(name)) {
    status = 'json-compatible'
  } else if (maybeJsonCompatibleOrganisms.includes(name)) {
    status = 'maybe-json-compatible'
  }

  components.push({
    type: name,
    name: existing?.name || name,
    category: existing?.category || guessCategory(name),
    canHaveChildren: existing?.canHaveChildren !== undefined ? existing.canHaveChildren : true,
    description: existing?.description || `${name} organism component`,
    status,
    source: 'organisms',
    jsonCompatible: jsonCompatibleOrganisms.includes(name) || maybeJsonCompatibleOrganisms.includes(name)
  })
})

// Process ui components (convert kebab-case to PascalCase)
ui.forEach(name => {
  // Convert kebab-case to PascalCase
  const pascalName = name.split('-').map(word =>
    word.charAt(0).toUpperCase() + word.slice(1)
  ).join('')

  const existing = existingMap.get(pascalName) || existingMap.get(name)
  components.push({
    type: pascalName,
    name: existing?.name || pascalName,
    category: existing?.category || guessCategory(pascalName),
    canHaveChildren: existing?.canHaveChildren !== undefined ? existing.canHaveChildren : canHaveChildren(pascalName),
    description: existing?.description || getDescription(pascalName),
    status: existing?.status || 'supported',
    source: 'ui'
  })
})

// Sort by category then name
components.sort((a, b) => {
  if (a.category !== b.category) {
    const order = ['layout', 'input', 'display', 'navigation', 'feedback', 'data', 'custom']
    return order.indexOf(a.category) - order.indexOf(b.category)
  }
  return a.name.localeCompare(b.name)
})

// Count by category
const byCategory = {}
components.forEach(c => {
  byCategory[c.category] = (byCategory[c.category] || 0) + 1
})

// Build the registry
const registry = {
  $schema: './schemas/json-components-registry-schema.json',
  version: '2.0.0',
  description: 'Registry of all components in the application',
  lastUpdated: new Date().toISOString(),
  categories: {
    layout: 'Layout and container components',
    input: 'Form inputs and interactive controls',
    display: 'Display and presentation components',
    navigation: 'Navigation and routing components',
    feedback: 'Alerts, notifications, and status indicators',
    data: 'Data display and visualization components',
    custom: 'Custom domain-specific components'
  },
  components,
  statistics: {
    total: components.length,
    supported: components.filter(c => c.status === 'supported').length,
    planned: components.filter(c => c.status === 'planned').length,
    jsonCompatible: components.filter(c => c.status === 'json-compatible').length,
    maybeJsonCompatible: components.filter(c => c.status === 'maybe-json-compatible').length,
    byCategory,
    bySource: {
      atoms: atoms.length,
      molecules: molecules.length,
      organisms: organisms.length,
      ui: ui.length
    }
  }
}

// Write to file
fs.writeFileSync(registryPath, JSON.stringify(registry, null, 2) + '\n', 'utf8')

console.log('\n✅ Updated json-components-registry.json')
console.log(` Total components: ${registry.statistics.total}`)
console.log(` By source:`)
console.log(` 🧱 atoms: ${registry.statistics.bySource.atoms}`)
console.log(` 🧪 molecules: ${registry.statistics.bySource.molecules}`)
console.log(` 🦠 organisms: ${registry.statistics.bySource.organisms}`)
console.log(` 🎨 ui: ${registry.statistics.bySource.ui}`)
console.log(` JSON compatibility:`)
console.log(` 🔥 Fully compatible: ${registry.statistics.jsonCompatible}`)
console.log(` ⚠️ Maybe compatible: ${registry.statistics.maybeJsonCompatible}`)
console.log(` By category:`)
Object.entries(byCategory).forEach(([cat, count]) => {
  console.log(` ${cat}: ${count}`)
})
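The kebab-case to PascalCase conversion the registry script applies to `ui` file names can be isolated as a small helper; the sample names below are illustrative:

```javascript
// Same conversion scan-and-update-registry.cjs applies to ui file names
// before looking them up in the existing registry map.
function toPascalCase(name) {
  return name
    .split('-')
    .map(word => word.charAt(0).toUpperCase() + word.slice(1))
    .join('')
}

console.log(toPascalCase('scroll-area')) // → ScrollArea
console.log(toPascalCase('hover-card'))  // → HoverCard
console.log(toPascalCase('button'))      // → Button
```

Because the lookup tries `existingMap.get(pascalName) || existingMap.get(name)`, registry entries keyed under either spelling are preserved across re-scans.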
76
frontends/codegen/scripts/update-index-exports.ts
Normal file
@@ -0,0 +1,76 @@
import fs from 'node:fs/promises'
import path from 'node:path'
import { fileURLToPath } from 'node:url'

const __dirname = path.dirname(fileURLToPath(import.meta.url))
const rootDir = path.resolve(__dirname, '..')

/**
 * Update index.ts files to remove exports for deleted components
 */
async function updateIndexFiles(): Promise<void> {
  console.log('📝 Updating index.ts files...\n')

  const directories = [
    'src/components/atoms',
    'src/components/molecules',
    'src/components/organisms',
    'src/components/ui',
  ]

  for (const dir of directories) {
    const indexPath = path.join(rootDir, dir, 'index.ts')
    const dirPath = path.join(rootDir, dir)

    console.log(`📂 Processing ${dir}/index.ts...`)

    try {
      // Read current index.ts
      const indexContent = await fs.readFile(indexPath, 'utf-8')
      const lines = indexContent.split('\n')

      // Get list of existing .tsx files
      const files = await fs.readdir(dirPath)
      const existingComponents = new Set(
        files
          .filter(f => f.endsWith('.tsx') && f !== 'index.tsx')
          .map(f => f.replace('.tsx', ''))
      )

      // Filter out exports for deleted components
      const updatedLines = lines.filter(line => {
        // Skip empty lines and comments
        if (!line.trim() || line.trim().startsWith('//')) {
          return true
        }

        // Check if it's an export line
        const exportMatch = line.match(/export\s+(?:\{([^}]+)\}|.+)\s+from\s+['"]\.\/([^'"]+)['"]/)
        if (!exportMatch) {
          return true // Keep non-export lines
        }

        const componentName = exportMatch[2]
        const exists = existingComponents.has(componentName)

        if (!exists) {
          console.log(` ❌ Removing export: ${componentName}`)
          return false
        }

        return true
      })

      // Write updated index.ts
      await fs.writeFile(indexPath, updatedLines.join('\n'))

      console.log(` ✅ Updated ${dir}/index.ts\n`)
    } catch (error) {
      console.error(` ❌ Error processing ${dir}/index.ts:`, error)
    }
  }

  console.log('✨ Index files updated!')
}

updateIndexFiles().catch(console.error)
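The export-line regex in `updateIndexFiles` handles both named and star re-exports; it can be checked against typical index.ts lines (the sample lines here are invented for illustration):

```javascript
// Exercise the export-line regex used by updateIndexFiles; the second
// capture group holds the module path the export resolves from.
const exportRe = /export\s+(?:\{([^}]+)\}|.+)\s+from\s+['"]\.\/([^'"]+)['"]/

const lines = [
  `export { Button } from './Button'`,
  `export * from './SearchBar'`,
  `// a comment line is not an export`,
]

const resolved = lines.map(line => {
  const m = line.match(exportRe)
  return m ? m[2] : null
})

console.log(resolved) // → [ 'Button', 'SearchBar', null ]
```

Non-matching lines return `null` and are kept by the script, so only exports whose target file no longer exists get dropped.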
262
frontends/codegen/scripts/update-registry-from-json.ts
Normal file
@@ -0,0 +1:262 @@
import fs from 'node:fs/promises'
import path from 'node:path'
import { fileURLToPath } from 'node:url'

const __dirname = path.dirname(fileURLToPath(import.meta.url))
const rootDir = path.resolve(__dirname, '..')

interface JSONComponent {
  type: string
  jsonCompatible?: boolean
  wrapperRequired?: boolean
  load?: {
    path: string
    export: string
    lazy?: boolean
  }
  props?: Record<string, unknown>
  metadata?: {
    notes?: string
  }
}

interface RegistryEntry {
  type: string
  name: string
  category: string
  canHaveChildren: boolean
  description: string
  status: 'supported' | 'deprecated'
  source: 'atoms' | 'molecules' | 'organisms' | 'ui' | 'wrappers' | 'custom'
  jsonCompatible: boolean
  wrapperRequired?: boolean
  load?: {
    path: string
    export: string
    lazy?: boolean
  }
  metadata?: {
    conversionDate?: string
    autoGenerated?: boolean
    notes?: string
  }
}

interface Registry {
  version: string
  categories: Record<string, string>
  sourceRoots: Record<string, string[]>
  components: RegistryEntry[]
  statistics: {
    total: number
    supported: number
    jsonCompatible: number
    byCategory: Record<string, number>
    bySource: Record<string, number>
  }
}

/**
 * Determine component category based on name and source
 */
function determineCategory(componentName: string, source: string): string {
  const name = componentName.toLowerCase()

  // Layout components
  if (/container|section|stack|flex|grid|layout|panel|sidebar|header|footer/.test(name)) {
    return 'layout'
  }

  // Input components
  if (/input|select|checkbox|radio|slider|switch|form|textarea|date|file|number|password|search/.test(name)) {
    return 'input'
  }

  // Navigation components
  if (/nav|menu|breadcrumb|tab|link|pagination/.test(name)) {
    return 'navigation'
  }

  // Feedback components
  if (/alert|toast|notification|spinner|loading|progress|skeleton|badge|indicator/.test(name)) {
    return 'feedback'
  }

  // Data display components
  if (/table|list|card|chart|graph|tree|timeline|avatar|image/.test(name)) {
    return 'data'
  }

  // Display components
  if (/text|heading|label|code|icon|divider|separator|spacer/.test(name)) {
    return 'display'
  }

  // Default to custom for organisms and complex components
  if (source === 'organisms' || source === 'molecules') {
    return 'custom'
  }

  return 'display'
}

/**
 * Determine if component can have children
 */
function canHaveChildren(componentName: string): boolean {
  const name = componentName.toLowerCase()

  // These typically don't have children
  const noChildren = /input|select|checkbox|radio|slider|switch|image|icon|divider|separator|spacer|spinner|progress|badge|dot/

  return !noChildren.test(name)
}

/**
 * Generate description for component
 */
function generateDescription(componentName: string, category: string): string {
  const descriptions: Record<string, string> = {
    layout: 'Layout container component',
    input: 'Form input component',
    navigation: 'Navigation component',
    feedback: 'Feedback and status component',
    data: 'Data display component',
    display: 'Display component',
    custom: 'Custom component',
  }

  return descriptions[category] || 'Component'
}

/**
 * Read all JSON files from a directory and create registry entries
 */
async function processDirectory(
  dir: string,
  source: 'atoms' | 'molecules' | 'organisms' | 'ui' | 'custom'
): Promise<RegistryEntry[]> {
const entries: RegistryEntry[] = []
|
||||
|
||||
try {
|
||||
const files = await fs.readdir(dir)
|
||||
const jsonFiles = files.filter(f => f.endsWith('.json'))
|
||||
|
||||
for (const file of jsonFiles) {
|
||||
const filePath = path.join(dir, file)
|
||||
const content = await fs.readFile(filePath, 'utf-8')
|
||||
const jsonComponent: JSONComponent = JSON.parse(content)
|
||||
|
||||
const componentName = jsonComponent.type
|
||||
if (!componentName) continue
|
||||
|
||||
const category = determineCategory(componentName, source)
|
||||
|
||||
const entry: RegistryEntry = {
|
||||
type: componentName,
|
||||
name: componentName,
|
||||
category,
|
||||
canHaveChildren: canHaveChildren(componentName),
|
||||
description: generateDescription(componentName, category),
|
||||
status: 'supported',
|
||||
source,
|
||||
jsonCompatible: jsonComponent.jsonCompatible !== false,
|
||||
wrapperRequired: jsonComponent.wrapperRequired || false,
|
||||
metadata: {
|
||||
conversionDate: new Date().toISOString().split('T')[0],
|
||||
autoGenerated: true,
|
||||
notes: jsonComponent.metadata?.notes,
|
||||
},
|
||||
}
|
||||
|
||||
if (jsonComponent.load) {
|
||||
entry.load = jsonComponent.load
|
||||
}
|
||||
|
||||
entries.push(entry)
|
||||
}
|
||||
} catch (error) {
|
||||
console.error(`Error processing ${dir}:`, error)
|
||||
}
|
||||
|
||||
return entries
|
||||
}
|
||||
|
||||
/**
|
||||
* Update the registry with new components
|
||||
*/
|
||||
async function updateRegistry() {
|
||||
console.log('📝 Updating json-components-registry.json...\n')
|
||||
|
||||
const registryPath = path.join(rootDir, 'json-components-registry.json')
|
||||
|
||||
// Read existing registry
|
||||
const registryContent = await fs.readFile(registryPath, 'utf-8')
|
||||
const registry: Registry = JSON.parse(registryContent)
|
||||
|
||||
console.log(` Current components: ${registry.components.length}`)
|
||||
|
||||
// Process each directory
|
||||
const newEntries: RegistryEntry[] = []
|
||||
|
||||
const directories = [
|
||||
{ dir: path.join(rootDir, 'src/config/pages/atoms'), source: 'atoms' as const },
|
||||
{ dir: path.join(rootDir, 'src/config/pages/molecules'), source: 'molecules' as const },
|
||||
{ dir: path.join(rootDir, 'src/config/pages/organisms'), source: 'organisms' as const },
|
||||
{ dir: path.join(rootDir, 'src/config/pages/ui'), source: 'ui' as const },
|
||||
{ dir: path.join(rootDir, 'src/config/pages/components'), source: 'custom' as const },
|
||||
]
|
||||
|
||||
for (const { dir, source } of directories) {
|
||||
const entries = await processDirectory(dir, source)
|
||||
newEntries.push(...entries)
|
||||
console.log(` Processed ${source}: ${entries.length} components`)
|
||||
}
|
||||
|
||||
// Merge with existing components (remove duplicates)
|
||||
const existingTypes = new Set(registry.components.map(c => c.type))
|
||||
const uniqueNewEntries = newEntries.filter(e => !existingTypes.has(e.type))
|
||||
|
||||
console.log(`\n New unique components: ${uniqueNewEntries.length}`)
|
||||
console.log(` Skipped duplicates: ${newEntries.length - uniqueNewEntries.length}`)
|
||||
|
||||
// Add new components
|
||||
registry.components.push(...uniqueNewEntries)
|
||||
|
||||
// Update statistics
|
||||
const byCategory: Record<string, number> = {}
|
||||
const bySource: Record<string, number> = {}
|
||||
|
||||
for (const component of registry.components) {
|
||||
byCategory[component.category] = (byCategory[component.category] || 0) + 1
|
||||
bySource[component.source] = (bySource[component.source] || 0) + 1
|
||||
}
|
||||
|
||||
registry.statistics = {
|
||||
total: registry.components.length,
|
||||
supported: registry.components.filter(c => c.status === 'supported').length,
|
||||
jsonCompatible: registry.components.filter(c => c.jsonCompatible).length,
|
||||
byCategory,
|
||||
bySource,
|
||||
}
|
||||
|
||||
// Sort components by type
|
||||
registry.components.sort((a, b) => a.type.localeCompare(b.type))
|
||||
|
||||
// Write updated registry
|
||||
await fs.writeFile(registryPath, JSON.stringify(registry, null, 2) + '\n')
|
||||
|
||||
console.log(`\n✅ Registry updated successfully!`)
|
||||
console.log(` Total components: ${registry.statistics.total}`)
|
||||
console.log(` JSON compatible: ${registry.statistics.jsonCompatible}`)
|
||||
console.log(`\n📊 By source:`)
|
||||
for (const [source, count] of Object.entries(bySource)) {
|
||||
console.log(` ${source.padEnd(12)}: ${count}`)
|
||||
}
|
||||
console.log(`\n📊 By category:`)
|
||||
for (const [category, count] of Object.entries(byCategory)) {
|
||||
console.log(` ${category.padEnd(12)}: ${count}`)
|
||||
}
|
||||
}
|
||||
|
||||
updateRegistry().catch(console.error)
|
||||
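The merge-and-statistics step above (dedupe new entries by `type`, then recount per category) can be sketched on toy data. `mergeRegistry` and the sample entries are hypothetical, for illustration only:

```javascript
// Minimal sketch of the merge step: skip entries whose `type` already
// exists, then rebuild per-category counts from the merged list.
function mergeRegistry(existing, incoming) {
  const existingTypes = new Set(existing.map((c) => c.type))
  const merged = existing.concat(incoming.filter((c) => !existingTypes.has(c.type)))

  const byCategory = {}
  for (const component of merged) {
    byCategory[component.category] = (byCategory[component.category] || 0) + 1
  }
  return { components: merged, byCategory }
}

const result = mergeRegistry(
  [{ type: 'Button', category: 'input' }],
  [
    { type: 'Button', category: 'input' }, // duplicate, skipped
    { type: 'Card', category: 'data' },
  ]
)
// result.components has 2 entries; result.byCategory = { input: 1, data: 1 }
```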
235 frontends/codegen/scripts/validate-json-registry.ts Normal file
@@ -0,0 +1,235 @@
import fs from 'node:fs/promises'
import path from 'node:path'
import { fileURLToPath, pathToFileURL } from 'node:url'
import * as PhosphorIcons from '@phosphor-icons/react'
import { JSONUIShowcase } from '../src/components/JSONUIShowcase'

type ComponentType = unknown

interface JsonRegistryEntry {
  name?: string
  type?: string
  export?: string
  source?: string
  status?: string
  wrapperRequired?: boolean
  wrapperComponent?: string
  wrapperFor?: string
  load?: {
    export?: string
  }
  deprecated?: unknown
}

interface JsonComponentRegistry {
  components?: JsonRegistryEntry[]
}

const sourceAliases: Record<string, Record<string, string>> = {
  atoms: {
    PageHeader: 'BasicPageHeader',
    SearchInput: 'BasicSearchInput',
  },
  molecules: {},
  organisms: {},
  ui: {
    Chart: 'ChartContainer',
    Resizable: 'ResizablePanelGroup',
  },
  wrappers: {},
}

const explicitComponentAllowlist: Record<string, ComponentType> = {
  JSONUIShowcase,
}

const getRegistryEntryKey = (entry: JsonRegistryEntry): string | undefined =>
  entry.name ?? entry.type

const getRegistryEntryExportName = (entry: JsonRegistryEntry): string | undefined =>
  entry.load?.export ?? entry.export ?? getRegistryEntryKey(entry)

const buildComponentMapFromExports = (
  exports: Record<string, unknown>
): Record<string, ComponentType> => {
  return Object.entries(exports).reduce<Record<string, ComponentType>>((acc, [key, value]) => {
    if (value && (typeof value === 'function' || typeof value === 'object')) {
      acc[key] = value as ComponentType
    }
    return acc
  }, {})
}

const buildComponentMapFromModules = (
  modules: Record<string, unknown>
): Record<string, ComponentType> => {
  return Object.values(modules).reduce<Record<string, ComponentType>>((acc, moduleExports) => {
    if (!moduleExports || typeof moduleExports !== 'object') {
      return acc
    }
    Object.entries(buildComponentMapFromExports(moduleExports as Record<string, unknown>)).forEach(
      ([key, component]) => {
        acc[key] = component
      }
    )
    return acc
  }, {})
}

const listFiles = async (options: {
  directory: string
  extensions: string[]
  recursive: boolean
}): Promise<string[]> => {
  const { directory, extensions, recursive } = options
  const entries = await fs.readdir(directory, { withFileTypes: true })
  const files: string[] = []

  await Promise.all(
    entries.map(async (entry) => {
      const fullPath = path.join(directory, entry.name)
      if (entry.isDirectory()) {
        if (recursive) {
          const nested = await listFiles({ directory: fullPath, extensions, recursive })
          files.push(...nested)
        }
        return
      }
      if (extensions.includes(path.extname(entry.name))) {
        files.push(fullPath)
      }
    })
  )

  return files
}

const importModules = async (files: string[]): Promise<Record<string, unknown>> => {
  const modules: Record<string, unknown> = {}
  await Promise.all(
    files.map(async (file) => {
      const moduleExports = await import(pathToFileURL(file).href)
      modules[file] = moduleExports
    })
  )
  return modules
}

const validateRegistry = async () => {
  const scriptDir = path.dirname(fileURLToPath(import.meta.url))
  const rootDir = path.resolve(scriptDir, '..')
  const registryPath = path.join(rootDir, 'json-components-registry.json')

  const registryRaw = await fs.readFile(registryPath, 'utf8')
  const registry = JSON.parse(registryRaw) as JsonComponentRegistry
  const registryEntries = registry.components ?? []
  const registryEntryByType = new Map(
    registryEntries
      .map((entry) => {
        const entryKey = getRegistryEntryKey(entry)
        return entryKey ? [entryKey, entry] : null
      })
      .filter((entry): entry is [string, JsonRegistryEntry] => Boolean(entry))
  )

  const sourceConfigs = [
    {
      source: 'atoms',
      directory: path.join(rootDir, 'src/components/atoms'),
      extensions: ['.tsx'],
      recursive: false,
    },
    {
      source: 'molecules',
      directory: path.join(rootDir, 'src/components/molecules'),
      extensions: ['.tsx'],
      recursive: false,
    },
    {
      source: 'organisms',
      directory: path.join(rootDir, 'src/components/organisms'),
      extensions: ['.tsx'],
      recursive: false,
    },
    {
      source: 'ui',
      directory: path.join(rootDir, 'src/components/ui'),
      extensions: ['.ts', '.tsx'],
      recursive: true,
    },
    {
      source: 'wrappers',
      directory: path.join(rootDir, 'src/lib/json-ui/wrappers'),
      extensions: ['.tsx'],
      recursive: false,
    },
  ]

  const componentMaps: Record<string, Record<string, ComponentType>> = {}
  await Promise.all(
    sourceConfigs.map(async (config) => {
      const files = await listFiles({
        directory: config.directory,
        extensions: config.extensions,
        recursive: config.recursive,
      })
      const modules = await importModules(files)
      componentMaps[config.source] = buildComponentMapFromModules(modules)
    })
  )

  componentMaps.icons = buildComponentMapFromExports(PhosphorIcons)

  const errors: string[] = []

  registryEntries.forEach((entry) => {
    const entryKey = getRegistryEntryKey(entry)
    const entryExportName = getRegistryEntryExportName(entry)

    if (!entryKey || !entryExportName) {
      errors.push(`Entry missing name/type/export: ${JSON.stringify(entry)}`)
      return
    }

    const source = entry.source
    if (!source || !componentMaps[source]) {
      errors.push(`${entryKey}: unknown source "${source ?? 'missing'}"`)
      return
    }

    const aliasName = sourceAliases[source]?.[entryKey]
    const component =
      componentMaps[source][entryExportName] ??
      (aliasName ? componentMaps[source][aliasName] : undefined) ??
      explicitComponentAllowlist[entryKey]

    if (!component) {
      const aliasNote = aliasName ? ` (alias: ${aliasName})` : ''
      errors.push(
        `${entryKey} (${source}) did not resolve export "${entryExportName}"${aliasNote}`
      )
    }

    if (entry.wrapperRequired) {
      if (!entry.wrapperComponent) {
        errors.push(`${entryKey} (${source}) requires a wrapperComponent but none is defined`)
        return
      }
      if (!registryEntryByType.has(entry.wrapperComponent)) {
        errors.push(
          `${entryKey} (${source}) references missing wrapperComponent ${entry.wrapperComponent}`
        )
      }
    }
  })

  if (errors.length > 0) {
    console.error('❌ JSON component registry export validation failed:')
    errors.forEach((error) => console.error(`- ${error}`))
    process.exit(1)
  }

  console.log('✅ JSON component registry exports are valid.')
}

await validateRegistry()
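The export-name resolution above follows a fixed fallback chain: `load.export` wins over a top-level `export` field, which wins over the entry's `name`/`type`. A minimal sketch with hypothetical entries:

```javascript
// Fallback chain for resolving which export to look up for a registry entry.
const exportNameFor = (entry) =>
  entry.load?.export ?? entry.export ?? entry.name ?? entry.type

const a = exportNameFor({ name: 'Chart', load: { export: 'ChartContainer' } }) // → 'ChartContainer'
const b = exportNameFor({ type: 'Badge', export: 'BadgeRoot' })                // → 'BadgeRoot'
const c = exportNameFor({ type: 'Badge' })                                     // → 'Badge'
```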
297 frontends/codegen/scripts/validate-json-schemas.ts Normal file
@@ -0,0 +1,297 @@
import fs from 'fs'
import path from 'path'
import { fileURLToPath } from 'url'
import { UIComponentSchema } from '../src/lib/json-ui/schema'

interface ComponentDefinitionProp {
  name: string
  type: 'string' | 'number' | 'boolean'
  options?: Array<string | number | boolean>
}

interface ComponentDefinition {
  type: string
  props?: ComponentDefinitionProp[]
}

interface ComponentNode {
  component: Record<string, unknown>
  path: string
}

const __dirname = path.dirname(fileURLToPath(import.meta.url))
const rootDir = path.resolve(__dirname, '..')

const componentDefinitionsPath = path.join(rootDir, 'src/lib/component-definitions.json')
const componentRegistryPath = path.join(rootDir, 'src/lib/json-ui/component-registry.ts')
const jsonRegistryPath = path.join(rootDir, 'json-components-registry.json')

const readJson = (filePath: string) => JSON.parse(fs.readFileSync(filePath, 'utf8'))
const readText = (filePath: string) => fs.readFileSync(filePath, 'utf8')

const componentDefinitions = readJson(componentDefinitionsPath) as ComponentDefinition[]
const componentDefinitionMap = new Map(componentDefinitions.map((def) => [def.type, def]))

const jsonRegistry = readJson(jsonRegistryPath) as {
  components?: Array<{ type?: string; name?: string; export?: string }>
}

const extractObjectLiteral = (content: string, marker: string) => {
  const markerIndex = content.indexOf(marker)
  if (markerIndex === -1) {
    throw new Error(`Unable to locate ${marker} in component registry file`)
  }
  const braceStart = content.indexOf('{', markerIndex)
  if (braceStart === -1) {
    throw new Error(`Unable to locate opening brace for ${marker}`)
  }
  let depth = 0
  for (let i = braceStart; i < content.length; i += 1) {
    const char = content[i]
    if (char === '{') depth += 1
    if (char === '}') depth -= 1
    if (depth === 0) {
      return content.slice(braceStart, i + 1)
    }
  }
  throw new Error(`Unable to locate closing brace for ${marker}`)
}

const extractKeysFromObjectLiteral = (literal: string) => {
  const body = literal.trim().replace(/^\{/, '').replace(/\}$/, '')
  const entries = body
    .split(',')
    .map((entry) => entry.trim())
    .filter(Boolean)
  const keys = new Set<string>()

  entries.forEach((entry) => {
    if (entry.startsWith('...')) {
      return
    }
    const [keyPart] = entry.split(':')
    const key = keyPart.trim()
    if (key) {
      keys.add(key)
    }
  })

  return keys
}

const componentRegistryContent = readText(componentRegistryPath)
const primitiveKeys = extractKeysFromObjectLiteral(
  extractObjectLiteral(componentRegistryContent, 'export const primitiveComponents')
)
const shadcnKeys = extractKeysFromObjectLiteral(
  extractObjectLiteral(componentRegistryContent, 'export const shadcnComponents')
)
const wrapperKeys = extractKeysFromObjectLiteral(
  extractObjectLiteral(componentRegistryContent, 'export const jsonWrapperComponents')
)
const iconKeys = extractKeysFromObjectLiteral(
  extractObjectLiteral(componentRegistryContent, 'export const iconComponents')
)

const registryTypes = new Set<string>(
  (jsonRegistry.components ?? [])
    .map((entry) => entry.type ?? entry.name ?? entry.export)
    .filter((value): value is string => Boolean(value))
)

const validComponentTypes = new Set<string>([
  ...primitiveKeys,
  ...shadcnKeys,
  ...wrapperKeys,
  ...iconKeys,
  ...componentDefinitions.map((def) => def.type),
  ...registryTypes,
])

const schemaRoots = [
  path.join(rootDir, 'src/config'),
  path.join(rootDir, 'src/data'),
]

const collectJsonFiles = (dir: string, files: string[] = []) => {
  if (!fs.existsSync(dir)) {
    return files
  }
  const entries = fs.readdirSync(dir, { withFileTypes: true })
  entries.forEach((entry) => {
    const fullPath = path.join(dir, entry.name)
    if (entry.isDirectory()) {
      collectJsonFiles(fullPath, files)
      return
    }
    if (entry.isFile() && entry.name.endsWith('.json')) {
      files.push(fullPath)
    }
  })
  return files
}

const isComponentNode = (value: unknown): value is Record<string, unknown> => {
  if (!value || typeof value !== 'object') {
    return false
  }
  const candidate = value as Record<string, unknown>
  if (typeof candidate.id !== 'string' || typeof candidate.type !== 'string') {
    return false
  }
  return (
    'props' in candidate ||
    'children' in candidate ||
    'className' in candidate ||
    'bindings' in candidate ||
    'events' in candidate ||
    'dataBinding' in candidate ||
    'style' in candidate
  )
}

const findComponents = (value: unknown, currentPath: string): ComponentNode[] => {
  const components: ComponentNode[] = []
  if (Array.isArray(value)) {
    value.forEach((item, index) => {
      components.push(...findComponents(item, `${currentPath}[${index}]`))
    })
    return components
  }
  if (!value || typeof value !== 'object') {
    return components
  }

  const candidate = value as Record<string, unknown>
  if (isComponentNode(candidate)) {
    components.push({ component: candidate, path: currentPath })
  }

  Object.entries(candidate).forEach(([key, child]) => {
    const nextPath = currentPath ? `${currentPath}.${key}` : key
    components.push(...findComponents(child, nextPath))
  })

  return components
}

const isTemplateBinding = (value: unknown) =>
  typeof value === 'string' && value.includes('{{') && value.includes('}}')

const validateProps = (
  component: Record<string, unknown>,
  filePath: string,
  componentPath: string,
  errors: string[]
) => {
  const definition = componentDefinitionMap.get(component.type as string)
  const props = component.props

  if (!definition || !definition.props || !props || typeof props !== 'object') {
    return
  }

  const propDefinitions = new Map(definition.props.map((prop) => [prop.name, prop]))

  Object.entries(props as Record<string, unknown>).forEach(([propName, propValue]) => {
    const propDefinition = propDefinitions.get(propName)
    if (!propDefinition) {
      errors.push(
        `${filePath} -> ${componentPath}: Unknown prop "${propName}" for component type "${component.type}"`
      )
      return
    }

    const expectedType = propDefinition.type
    const actualType = Array.isArray(propValue) ? 'array' : typeof propValue

    if (
      expectedType === 'string' &&
      actualType !== 'string' &&
      propValue !== undefined
    ) {
      errors.push(
        `${filePath} -> ${componentPath}: Prop "${propName}" expected string but got ${actualType}`
      )
      return
    }

    if (
      expectedType === 'number' &&
      actualType !== 'number' &&
      !isTemplateBinding(propValue)
    ) {
      errors.push(
        `${filePath} -> ${componentPath}: Prop "${propName}" expected number but got ${actualType}`
      )
      return
    }

    if (
      expectedType === 'boolean' &&
      actualType !== 'boolean' &&
      !isTemplateBinding(propValue)
    ) {
      errors.push(
        `${filePath} -> ${componentPath}: Prop "${propName}" expected boolean but got ${actualType}`
      )
      return
    }

    if (propDefinition.options && propValue !== undefined) {
      if (!propDefinition.options.includes(propValue as string | number | boolean)) {
        errors.push(
          `${filePath} -> ${componentPath}: Prop "${propName}" value must be one of ${propDefinition.options.join(', ')}`
        )
      }
    }
  })
}

const validateComponentsInFile = (filePath: string, errors: string[]) => {
  let parsed: unknown
  try {
    parsed = readJson(filePath)
  } catch (error) {
    errors.push(`${filePath}: Unable to parse JSON - ${(error as Error).message}`)
    return
  }

  const components = findComponents(parsed, 'root')
  if (components.length === 0) {
    return
  }

  components.forEach(({ component, path: componentPath }) => {
    const parseResult = UIComponentSchema.safeParse(component)
    if (!parseResult.success) {
      const issueMessages = parseResult.error.issues
        .map((issue) => ` - ${issue.path.join('.')}: ${issue.message}`)
        .join('\n')
      errors.push(
        `${filePath} -> ${componentPath}: Schema validation failed\n${issueMessages}`
      )
    }

    if (!validComponentTypes.has(component.type as string)) {
      errors.push(
        `${filePath} -> ${componentPath}: Unknown component type "${component.type}"`
      )
    }

    validateProps(component, filePath, componentPath, errors)
  })
}

const jsonFiles = schemaRoots.flatMap((dir) => collectJsonFiles(dir))
const errors: string[] = []

jsonFiles.forEach((filePath) => validateComponentsInFile(filePath, errors))

if (errors.length > 0) {
  console.error('JSON schema validation failed:')
  errors.forEach((error) => console.error(`- ${error}`))
  process.exit(1)
}

console.log('JSON schema validation passed.')
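The `extractObjectLiteral` helper above uses simple brace counting rather than a full parser. A self-contained sketch of the same technique (the sample `src` string is hypothetical; note the naive version does not skip braces inside strings or comments):

```javascript
// Walk from the first '{' after the marker, tracking nesting depth
// until it returns to zero; the slice is the whole object literal.
function extractObjectLiteral(content, marker) {
  const start = content.indexOf('{', content.indexOf(marker))
  let depth = 0
  for (let i = start; i < content.length; i += 1) {
    if (content[i] === '{') depth += 1
    if (content[i] === '}') depth -= 1
    if (depth === 0) return content.slice(start, i + 1)
  }
  throw new Error(`Unbalanced braces after ${marker}`)
}

const src = "export const primitiveComponents = { div: Div, nested: { a: 1 } }"
const literal = extractObjectLiteral(src, 'primitiveComponents')
// → '{ div: Div, nested: { a: 1 } }' (inner braces handled by the depth counter)
```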
82 frontends/codegen/scripts/validate-json-ui-registry.cjs Normal file
@@ -0,0 +1,82 @@
#!/usr/bin/env node

const fs = require('fs')
const path = require('path')

const registryPath = path.join(process.cwd(), 'json-components-registry.json')
const schemaPath = path.join(process.cwd(), 'src', 'schemas', 'registry-validation.json')

if (!fs.existsSync(registryPath)) {
  console.error('❌ Could not find json-components-registry.json')
  process.exit(1)
}

if (!fs.existsSync(schemaPath)) {
  console.error('❌ Could not find src/schemas/registry-validation.json')
  process.exit(1)
}

const registry = JSON.parse(fs.readFileSync(registryPath, 'utf8'))
const schema = JSON.parse(fs.readFileSync(schemaPath, 'utf8'))

const primitiveTypes = new Set([
  'div',
  'span',
  'p',
  'h1',
  'h2',
  'h3',
  'h4',
  'h5',
  'h6',
  'section',
  'article',
  'header',
  'footer',
  'main',
  'aside',
  'nav',
])

const registryTypes = new Set()

for (const entry of registry.components || []) {
  if (entry.source === 'atoms' || entry.source === 'molecules') {
    const name = entry.export || entry.name || entry.type
    if (name) {
      registryTypes.add(name)
    }
  }
}

const schemaTypes = new Set()

const collectTypes = (components) => {
  if (!components) return
  if (Array.isArray(components)) {
    components.forEach(collectTypes)
    return
  }
  if (components.type) {
    schemaTypes.add(components.type)
  }
  if (components.children) {
    collectTypes(components.children)
  }
}

collectTypes(schema.components || [])

const missing = []
for (const type of schemaTypes) {
  if (!primitiveTypes.has(type) && !registryTypes.has(type)) {
    missing.push(type)
  }
}

if (missing.length) {
  console.error(`❌ Missing registry entries for: ${missing.join(', ')}`)
  process.exit(1)
}

console.log('✅ JSON UI registry validation passed for primitives and atom/molecule components.')
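The `collectTypes` walk above gathers every `type` from a component tree, recursing through `children`. A side-effect-free variant on a hypothetical toy tree:

```javascript
// Recursive type collection: accumulate every node's `type` into a Set,
// descending through `children` (which may be an array or a single node).
function collectTypes(node, out = new Set()) {
  if (!node) return out
  if (Array.isArray(node)) {
    node.forEach((child) => collectTypes(child, out))
    return out
  }
  if (node.type) out.add(node.type)
  if (node.children) collectTypes(node.children, out)
  return out
}

const tree = { type: 'div', children: [{ type: 'Button' }, { type: 'div' }] }
const types = collectTypes(tree)
// → Set { 'div', 'Button' } (the duplicate 'div' collapses into the Set)
```

Returning the accumulator instead of mutating a module-level Set makes the helper easier to reuse and test in isolation.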
224 frontends/codegen/scripts/validate-qemu.sh Normal file
@@ -0,0 +1,224 @@
#!/bin/bash

# QEMU Multi-Architecture Validation Script
# This script validates that QEMU is properly configured and multi-arch builds work

set -e

echo "🔍 QEMU Multi-Architecture Validation"
echo "======================================"
echo ""

# Colors
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
BLUE='\033[0;34m'
NC='\033[0m'

PASSED=0
FAILED=0

# Function to print test results
print_result() {
    if [ $1 -eq 0 ]; then
        echo -e "${GREEN}✅ PASS${NC}: $2"
        PASSED=$((PASSED + 1))  # not ((PASSED++)): that returns non-zero when the counter is 0, which aborts under `set -e`
    else
        echo -e "${RED}❌ FAIL${NC}: $2"
        FAILED=$((FAILED + 1))
    fi
}

echo -e "${BLUE}Step 1: Checking Docker installation${NC}"
echo "--------------------------------------"

if command -v docker &> /dev/null; then
    DOCKER_VERSION=$(docker --version)
    print_result 0 "Docker is installed: $DOCKER_VERSION"
else
    print_result 1 "Docker is not installed"
    exit 1
fi

echo ""
echo -e "${BLUE}Step 2: Checking Docker Buildx${NC}"
echo "--------------------------------------"

if docker buildx version &> /dev/null; then
    BUILDX_VERSION=$(docker buildx version)
    print_result 0 "Docker Buildx is available: $BUILDX_VERSION"
else
    print_result 1 "Docker Buildx is not available"
    echo "Installing Docker Buildx..."
    docker buildx install
fi

echo ""
echo -e "${BLUE}Step 3: Setting up QEMU${NC}"
echo "--------------------------------------"

echo "Installing QEMU user static binaries..."
if docker run --rm --privileged multiarch/qemu-user-static --reset -p yes > /dev/null 2>&1; then
    print_result 0 "QEMU installation successful"
else
    print_result 1 "QEMU installation failed"
fi

echo ""
echo -e "${BLUE}Step 4: Checking QEMU binaries${NC}"
echo "--------------------------------------"

if docker run --rm multiarch/qemu-user-static --version > /dev/null 2>&1; then
    QEMU_VERSION=$(docker run --rm multiarch/qemu-user-static --version | head -n 1)
    print_result 0 "QEMU binaries are functional: $QEMU_VERSION"
else
    print_result 1 "QEMU binaries not accessible"
fi

echo ""
echo -e "${BLUE}Step 5: Setting up Buildx builder${NC}"
echo "--------------------------------------"

if docker buildx inspect multiarch &> /dev/null; then
    echo "Builder 'multiarch' already exists"
    docker buildx use multiarch
    print_result 0 "Using existing builder 'multiarch'"
else
    if docker buildx create --name multiarch --driver docker-container --use > /dev/null 2>&1; then
        print_result 0 "Created new builder 'multiarch'"
    else
        print_result 1 "Failed to create builder"
    fi
fi

if docker buildx inspect --bootstrap > /dev/null 2>&1; then
    print_result 0 "Builder bootstrap successful"
else
    print_result 1 "Builder bootstrap failed"
fi

echo ""
echo -e "${BLUE}Step 6: Checking supported platforms${NC}"
echo "--------------------------------------"

PLATFORMS=$(docker buildx inspect multiarch | grep "Platforms:" | cut -d: -f2)
echo "Available platforms:$PLATFORMS"

if echo "$PLATFORMS" | grep -q "linux/amd64"; then
    print_result 0 "AMD64 platform supported"
else
    print_result 1 "AMD64 platform not supported"
fi

if echo "$PLATFORMS" | grep -q "linux/arm64"; then
    print_result 0 "ARM64 platform supported"
else
    print_result 1 "ARM64 platform not supported"
fi

echo ""
echo -e "${BLUE}Step 7: Testing multi-arch build (dry run)${NC}"
echo "--------------------------------------"

# Create a simple test Dockerfile
TEST_DIR=$(mktemp -d)
cat > "$TEST_DIR/Dockerfile" << 'EOF'
FROM alpine:latest
RUN echo "Architecture: $(uname -m)"
CMD ["echo", "Multi-arch test successful"]
EOF

echo "Testing build for linux/amd64..."
if docker buildx build --platform linux/amd64 -t test-qemu:amd64 "$TEST_DIR" > /dev/null 2>&1; then
    print_result 0 "AMD64 build successful"
else
    print_result 1 "AMD64 build failed"
fi

echo "Testing build for linux/arm64..."
if docker buildx build --platform linux/arm64 -t test-qemu:arm64 "$TEST_DIR" > /dev/null 2>&1; then
    print_result 0 "ARM64 build successful (cross-compiled)"
else
    print_result 1 "ARM64 build failed"
fi

echo "Testing multi-platform build..."
if docker buildx build --platform linux/amd64,linux/arm64 -t test-qemu:multi "$TEST_DIR" > /dev/null 2>&1; then
    print_result 0 "Multi-platform build successful"
else
    print_result 1 "Multi-platform build failed"
fi

# Cleanup
rm -rf "$TEST_DIR"

echo ""
echo -e "${BLUE}Step 8: Validating CI/CD configurations${NC}"
echo "--------------------------------------"

# Check GitHub Actions
if grep -q "docker/setup-qemu-action" .github/workflows/ci.yml 2>/dev/null; then
    print_result 0 "GitHub Actions CI has QEMU configured"
else
    print_result 1 "GitHub Actions CI missing QEMU"
fi

if grep -q "docker/setup-qemu-action" .github/workflows/release.yml 2>/dev/null; then
    print_result 0 "GitHub Actions Release has QEMU configured"
else
    print_result 1 "GitHub Actions Release missing QEMU"
fi

# Check CircleCI
if grep -q "multiarch/qemu-user-static" .circleci/config.yml 2>/dev/null; then
    print_result 0 "CircleCI has QEMU configured"
else
    print_result 1 "CircleCI missing QEMU"
fi

# Check GitLab CI
if grep -q "multiarch/qemu-user-static" .gitlab-ci.yml 2>/dev/null; then
    print_result 0 "GitLab CI has QEMU configured"
else
    print_result 1 "GitLab CI missing QEMU"
fi

# Check Jenkins
if grep -q "multiarch/qemu-user-static" Jenkinsfile 2>/dev/null; then
    print_result 0 "Jenkins has QEMU configured"
else
    print_result 1 "Jenkins missing QEMU"
fi

echo ""
echo "======================================"
echo -e "${BLUE}Validation Summary${NC}"
echo "======================================"
|
||||
echo ""
|
||||
echo -e "${GREEN}Passed: $PASSED${NC}"
|
||||
echo -e "${RED}Failed: $FAILED${NC}"
|
||||
echo ""
|
||||
|
||||
if [ $FAILED -eq 0 ]; then
|
||||
echo -e "${GREEN}🎉 All validations passed!${NC}"
|
||||
echo ""
|
||||
echo "Your system is ready for multi-architecture builds."
|
||||
echo ""
|
||||
echo "Next steps:"
|
||||
echo " 1. Run: ./scripts/build-multiarch.sh myapp latest"
|
||||
echo " 2. Or push to CI/CD and watch multi-arch builds happen automatically"
|
||||
echo ""
|
||||
exit 0
|
||||
else
|
||||
echo -e "${RED}⚠️ Some validations failed${NC}"
|
||||
echo ""
|
||||
echo "Please review the failures above and fix them before proceeding."
|
||||
echo ""
|
||||
echo "Common fixes:"
|
||||
echo " - Install Docker: https://docs.docker.com/get-docker/"
|
||||
echo " - Update Docker to latest version"
|
||||
echo " - Run with sudo if permission denied"
|
||||
echo ""
|
||||
exit 1
|
||||
fi
|
||||
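The Step 8 checks above all follow the same pattern: `grep -q` for a known setup string in a CI config file, with stderr silenced so a missing file counts as "not configured". A self-contained sketch of that check against a throwaway workflow file (the path and YAML contents here are hypothetical, not taken from this repo):

```shell
# Create a minimal workflow file and probe it the same way the script does.
tmp=$(mktemp -d)
cat > "$tmp/ci.yml" << 'EOF'
jobs:
  build:
    steps:
      - uses: docker/setup-qemu-action@v3
EOF
if grep -q "docker/setup-qemu-action" "$tmp/ci.yml" 2>/dev/null; then
  qemu_status="configured"
else
  qemu_status="missing"
fi
echo "QEMU: $qemu_status"
rm -rf "$tmp"
```

Because `grep -q` exits nonzero both when the string is absent and when the file does not exist, a repo without that workflow file simply reports "missing QEMU" rather than erroring out.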
176
frontends/codegen/scripts/validate-supported-components.cjs
Normal file
@@ -0,0 +1,176 @@
const fs = require('fs')
const path = require('path')

const rootDir = path.resolve(__dirname, '..')
const registryPath = path.join(rootDir, 'json-components-registry.json')
const definitionsPath = path.join(rootDir, 'src/lib/component-definitions.json')
const componentTypesPath = path.join(rootDir, 'src/types/json-ui-component-types.ts')
const uiRegistryPath = path.join(rootDir, 'src/lib/json-ui/component-registry.ts')
const atomIndexPath = path.join(rootDir, 'src/components/atoms/index.ts')
const moleculeIndexPath = path.join(rootDir, 'src/components/molecules/index.ts')

const readJson = (filePath) => JSON.parse(fs.readFileSync(filePath, 'utf8'))
const readText = (filePath) => fs.readFileSync(filePath, 'utf8')

const registryData = readJson(registryPath)
const supportedComponents = (registryData.components ?? []).filter(
  (component) => component.status === 'supported'
)

const componentDefinitions = readJson(definitionsPath)
const definitionTypes = new Set(componentDefinitions.map((def) => def.type))

const componentTypesContent = readText(componentTypesPath)
const componentTypeSet = new Set()
const componentTypeRegex = /"([^"]+)"/g
let match
while ((match = componentTypeRegex.exec(componentTypesContent)) !== null) {
  componentTypeSet.add(match[1])
}

const extractObjectLiteral = (content, marker) => {
  const markerIndex = content.indexOf(marker)
  if (markerIndex === -1) {
    throw new Error(`Unable to locate ${marker} in component registry file`)
  }
  const braceStart = content.indexOf('{', markerIndex)
  if (braceStart === -1) {
    throw new Error(`Unable to locate opening brace for ${marker}`)
  }
  let depth = 0
  for (let i = braceStart; i < content.length; i += 1) {
    const char = content[i]
    if (char === '{') depth += 1
    if (char === '}') depth -= 1
    if (depth === 0) {
      return content.slice(braceStart, i + 1)
    }
  }
  throw new Error(`Unable to locate closing brace for ${marker}`)
}

const extractKeysFromObjectLiteral = (literal) => {
  const body = literal.trim().replace(/^\{/, '').replace(/\}$/, '')
  const entries = body
    .split(',')
    .map((entry) => entry.trim())
    .filter(Boolean)
  const keys = new Set()

  entries.forEach((entry) => {
    if (entry.startsWith('...')) {
      return
    }
    const [keyPart] = entry.split(':')
    const key = keyPart.trim()
    if (key) {
      keys.add(key)
    }
  })

  return keys
}

const uiRegistryContent = readText(uiRegistryPath)
const primitiveKeys = extractKeysFromObjectLiteral(
  extractObjectLiteral(uiRegistryContent, 'export const primitiveComponents')
)
const shadcnKeys = extractKeysFromObjectLiteral(
  extractObjectLiteral(uiRegistryContent, 'export const shadcnComponents')
)
const wrapperKeys = extractKeysFromObjectLiteral(
  extractObjectLiteral(uiRegistryContent, 'export const jsonWrapperComponents')
)
const iconKeys = extractKeysFromObjectLiteral(
  extractObjectLiteral(uiRegistryContent, 'export const iconComponents')
)

const extractExports = (content) => {
  const exportsSet = new Set()
  const exportRegex = /export\s+\{([^}]+)\}\s+from/g
  let exportMatch
  while ((exportMatch = exportRegex.exec(content)) !== null) {
    const names = exportMatch[1]
      .split(',')
      .map((name) => name.trim())
      .filter(Boolean)
    names.forEach((name) => {
      const [exportName] = name.split(/\s+as\s+/)
      if (exportName) {
        exportsSet.add(exportName.trim())
      }
    })
  }
  return exportsSet
}

const atomExports = extractExports(readText(atomIndexPath))
const moleculeExports = extractExports(readText(moleculeIndexPath))

const uiRegistryKeys = new Set([
  ...primitiveKeys,
  ...shadcnKeys,
  ...wrapperKeys,
  ...iconKeys,
  ...atomExports,
  ...moleculeExports,
])

const missingInTypes = []
const missingInDefinitions = []
const missingInRegistry = []

supportedComponents.forEach((component) => {
  const typeName = component.type ?? component.name ?? component.export
  const registryName = component.export ?? component.name ?? component.type

  if (!typeName) {
    return
  }

  if (!componentTypeSet.has(typeName)) {
    missingInTypes.push(typeName)
  }

  if (!definitionTypes.has(typeName)) {
    missingInDefinitions.push(typeName)
  }

  const source = component.source ?? 'unknown'
  let registryHasComponent = uiRegistryKeys.has(registryName)

  if (source === 'atoms') {
    registryHasComponent = atomExports.has(registryName)
  }
  if (source === 'molecules') {
    registryHasComponent = moleculeExports.has(registryName)
  }
  if (source === 'ui') {
    registryHasComponent = shadcnKeys.has(registryName)
  }

  if (!registryHasComponent) {
    missingInRegistry.push(`${registryName} (${source})`)
  }
})

const unique = (list) => Array.from(new Set(list)).sort()

const errors = []
if (missingInTypes.length > 0) {
  errors.push(`Missing in ComponentType union: ${unique(missingInTypes).join(', ')}`)
}
if (missingInDefinitions.length > 0) {
  errors.push(`Missing in component definitions: ${unique(missingInDefinitions).join(', ')}`)
}
if (missingInRegistry.length > 0) {
  errors.push(`Missing in UI registry mapping: ${unique(missingInRegistry).join(', ')}`)
}

if (errors.length > 0) {
  console.error('Supported component validation failed:')
  errors.forEach((error) => console.error(`- ${error}`))
  process.exit(1)
}

console.log('Supported component validation passed.')
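The `extractExports` helper above pulls exported names out of `export { A, B as C } from ...` statements and strips `as` aliases. The same extraction can be sketched with grep/sed against a throwaway `index.ts` (component names here are hypothetical examples, not from the real barrel files):

```shell
# Emulate the export-name extraction: match "export { ... }" groups,
# split on commas, drop "as" aliases, and trim whitespace.
tmp=$(mktemp -d)
cat > "$tmp/index.ts" << 'EOF'
export { Button, Card as UICard } from './button'
export { Badge } from './badge'
EOF
exports=$(grep -o 'export {[^}]*}' "$tmp/index.ts" \
  | sed -e 's/export {//' -e 's/}//' \
  | tr ',' '\n' \
  | sed -e 's/ as .*//' -e 's/^ *//' -e 's/ *$//' \
  | sort)
echo "$exports"
rm -rf "$tmp"
```

This yields the original export names (`Badge`, `Button`, `Card`), matching the validator's behavior of keeping the pre-alias name so it can be compared against the registry keys.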
80
frontends/codegen/scripts/verify-docker-build.sh
Normal file
@@ -0,0 +1,80 @@
#!/bin/bash

# Docker Build Verification Script
# Checks that all prerequisites are met before building Docker image

set -e

echo "🔍 Checking Docker build prerequisites..."
echo ""

# Check if Dockerfile exists
if [ ! -f "Dockerfile" ]; then
    echo "❌ Dockerfile not found"
    exit 1
fi
echo "✅ Dockerfile found"

# Check if package.json exists
if [ ! -f "package.json" ]; then
    echo "❌ package.json not found"
    exit 1
fi
echo "✅ package.json found"

# Check if workspace packages exist
if [ ! -d "packages/spark-tools" ]; then
    echo "❌ packages/spark-tools directory not found"
    exit 1
fi
echo "✅ packages/spark-tools directory found"

if [ ! -d "packages/spark" ]; then
    echo "❌ packages/spark directory not found"
    exit 1
fi
echo "✅ packages/spark directory found"

# Check if spark-tools is built
if [ ! -d "packages/spark-tools/dist" ]; then
    echo "⚠️ packages/spark-tools/dist not found - building now..."
    cd packages/spark-tools
    npm install
    npm run build
    cd ../..
    echo "✅ Built spark-tools"
else
    echo "✅ packages/spark-tools/dist found"
fi

# Verify critical files in dist
CRITICAL_FILES=(
    "packages/spark-tools/dist/sparkVitePlugin.js"
    "packages/spark-tools/dist/index.js"
    "packages/spark-tools/dist/spark.js"
)

for file in "${CRITICAL_FILES[@]}"; do
    if [ ! -f "$file" ]; then
        echo "❌ Critical file missing: $file"
        exit 1
    fi
done
echo "✅ All critical dist files present"

# Check Docker is available
if ! command -v docker &> /dev/null; then
    echo "⚠️ Docker not found - skipping Docker checks"
else
    echo "✅ Docker is available"
    docker --version
fi

echo ""
echo "🎉 All prerequisites satisfied!"
echo ""
echo "You can now build the Docker image with:"
echo " docker build -t codeforge:local ."
echo ""
echo "Or run the full CI pipeline locally with GitHub Actions:"
echo " act -j docker-build"
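The critical-file loop above fails fast on the first missing artifact. A standalone sketch of the same pattern, run against a temp directory with deliberately incomplete contents (the file names here are placeholders, not the real dist layout), shows how a counting variant behaves when you want to report every missing file before deciding:

```shell
# Build a temp "dist" with two of three expected files, then count misses.
tmp=$(mktemp -d)
mkdir -p "$tmp/dist"
touch "$tmp/dist/index.js" "$tmp/dist/spark.js"
missing=0
for file in "$tmp/dist/index.js" "$tmp/dist/spark.js" "$tmp/dist/absent.js"; do
  if [ ! -f "$file" ]; then
    echo "missing: $file"
    missing=$((missing + 1))
  fi
done
echo "missing count: $missing"
rm -rf "$tmp"
```

The script's `exit 1` inside the loop is the stricter choice for CI; the counting form is useful when a full report is worth more than an early abort.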
118
frontends/codegen/scripts/verify-packages-removal.sh
Normal file
@@ -0,0 +1,118 @@
#!/bin/bash

# Packages Folder Removal Verification Script
# This script verifies that all packages folder dependencies have been removed

set -e

echo "🔍 Verifying packages folder removal..."
echo ""

# Colors for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
NC='\033[0m' # No Color

ERRORS=0

# Check 1: Verify packages folder is in .dockerignore
echo "📋 Checking .dockerignore..."
if grep -q "^packages$" .dockerignore; then
    echo -e "${GREEN}✓${NC} packages folder is in .dockerignore"
else
    echo -e "${RED}✗${NC} packages folder is NOT in .dockerignore"
    ERRORS=$((ERRORS + 1))
fi
echo ""

# Check 2: Verify Dockerfile doesn't reference packages
echo "📋 Checking Dockerfile..."
if grep -q "packages" Dockerfile; then
    echo -e "${RED}✗${NC} Dockerfile still references packages folder"
    grep -n "packages" Dockerfile
    ERRORS=$((ERRORS + 1))
else
    echo -e "${GREEN}✓${NC} Dockerfile doesn't reference packages folder"
fi
echo ""

# Check 3: Verify package.json doesn't have workspace references
echo "📋 Checking package.json..."
if grep -q "workspace:" package.json; then
    echo -e "${RED}✗${NC} package.json still has workspace: protocol references"
    grep -n "workspace:" package.json
    ERRORS=$((ERRORS + 1))
else
    echo -e "${GREEN}✓${NC} package.json doesn't have workspace: references"
fi
echo ""

# Check 4: Check for any imports from @github/spark or @local/spark-wrapper
echo "📋 Checking for old package imports in source code..."
OLD_IMPORTS=$(find src -type f \( -name "*.ts" -o -name "*.tsx" \) -exec grep -l "@github/spark\|@local/spark" {} \; 2>/dev/null || true)
if [ -n "$OLD_IMPORTS" ]; then
    echo -e "${RED}✗${NC} Found old package imports:"
    echo "$OLD_IMPORTS"
    ERRORS=$((ERRORS + 1))
else
    echo -e "${GREEN}✓${NC} No old package imports found"
fi
echo ""

# Check 5: Verify storage service exists
echo "📋 Checking storage service..."
if [ -f "src/lib/storage-service.ts" ]; then
    echo -e "${GREEN}✓${NC} storage-service.ts exists"
else
    echo -e "${RED}✗${NC} storage-service.ts is missing"
    ERRORS=$((ERRORS + 1))
fi
echo ""

# Check 6: Verify spark library exists
echo "📋 Checking spark library..."
if [ -f "src/lib/spark/index.ts" ]; then
    echo -e "${GREEN}✓${NC} spark/index.ts exists"
else
    echo -e "${RED}✗${NC} spark/index.ts is missing"
    ERRORS=$((ERRORS + 1))
fi
echo ""

# Check 7: Verify useKV hook exists
echo "📋 Checking useKV hook..."
if [ -f "src/hooks/use-kv.ts" ]; then
    echo -e "${GREEN}✓${NC} use-kv.ts exists"
else
    echo -e "${RED}✗${NC} use-kv.ts is missing"
    ERRORS=$((ERRORS + 1))
fi
echo ""

# Check 8: Verify StorageSettings component exists
echo "📋 Checking StorageSettings component..."
if [ -f "src/components/StorageSettings.tsx" ]; then
    echo -e "${GREEN}✓${NC} StorageSettings.tsx exists"
else
    echo -e "${YELLOW}⚠${NC} StorageSettings.tsx not found (optional)"
fi
echo ""

# Summary
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
if [ $ERRORS -eq 0 ]; then
    echo -e "${GREEN}✅ All checks passed!${NC}"
    echo ""
    echo "The packages folder can be safely removed:"
    echo " rm -rf packages"
    echo ""
    echo "Next steps:"
    echo " 1. Test the build: npm run build"
    echo " 2. Test Docker build: docker build -t codeforge ."
    echo " 3. Commit the changes"
else
    echo -e "${RED}❌ $ERRORS error(s) found${NC}"
    echo "Please fix the errors above before removing the packages folder"
    exit 1
fi
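Check 1 above uses an anchored pattern, `^packages$`, so only the bare `packages` entry counts; lines such as `packages/foo` or `my-packages` do not satisfy it. A minimal self-contained illustration with a throwaway `.dockerignore` (contents are hypothetical):

```shell
# Anchored grep matches only the exact line "packages".
tmp=$(mktemp -d)
printf 'node_modules\npackages\ndist\n' > "$tmp/.dockerignore"
if grep -q "^packages$" "$tmp/.dockerignore"; then
  result=present
else
  result=absent
fi
echo "$result"
rm -rf "$tmp"
```

The anchors matter: an unanchored `grep -q "packages"` would also pass on a file that only ignores `packages/dist`, silently weakening the check.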