ROADMAP.md

2026-01-10 00:23:27 +00:00
parent 3a5c6facdb
commit 606f2b40e4
17 changed files with 246 additions and 49 deletions


@@ -22,6 +22,7 @@ SDL3 + bgfx demo app with Lua-driven runtime configuration, audio playback, and
- Linux/macOS: `source build/conanrun.sh`
- Windows (cmd.exe): `build\conanrun.bat`
5. `python scripts/dev_commands.py run`
6. Use the guardrail prefix helper before invoking Codex if you want to prepend standard instructions: `python scripts/auto_prompt_prefixer.py --prompt "Describe the package layout"`. It defaults to `config/prompt_prefix.txt`, or pass `--prefix-file`/`--prefix-text` to change it.
## Build helper commands
- `python scripts/dev_commands.py configure` uses Ninja by default (Ninja+MSVC on Windows) and writes to `build-ninja`/`build-ninja-msvc`; override with `--generator` and `--build-dir`.


@@ -86,7 +86,7 @@ Treat JSON config as a declarative control plane that compiles into scene, resou
- Deliverable: app boot always compiles config and prefers IR-derived data.
- Acceptance: running with only a JSON config triggers IR compilation, and Lua scene load only happens if explicitly enabled.
-- Introduce an npm-like dependency manifest per package plus a top-level `packages/registry.json`, allowing packages to declare other packages (e.g., `materialx`) they rely on before their workflows run; this keeps each workflow self-contained and enforces the “only register what the workflow references” rule.
+- Introduce an npm-like dependency manifest per package and rely on the `packages/` directory (resolved through `scripts/package_lint.py`) so packages can declare dependencies (e.g., `materialx`) before their workflows run; this keeps each workflow self-contained and enforces the “only register what the workflow references” rule.
- Current catalog entries: `packages/seed` (spinning cube), `packages/gui` (GUI panels), `packages/soundboard`, `packages/quake3`, `packages/engine_tester`, and `packages/materialx`; each package lists its dependency graph and workflow entry point so a loader can resolve them in dependency order.
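Resolving packages in dependency order, as described above, can be sketched as a plain topological sort over the manifests. This is a minimal illustration, not the project's loader; the manifest shape (name mapped to a dependency list) mirrors the catalog entries above.

```python
def resolve_order(manifests: dict[str, list[str]]) -> list[str]:
    """Return package names in dependency order (dependencies first).

    `manifests` maps a package name to its declared dependencies.
    Raises ValueError on a dependency cycle or an unknown dependency.
    """
    order: list[str] = []
    state: dict[str, str] = {}  # name -> "visiting" | "done"

    def visit(name: str) -> None:
        if state.get(name) == "done":
            return
        if state.get(name) == "visiting":
            raise ValueError(f"dependency cycle involving `{name}`")
        if name not in manifests:
            raise ValueError(f"unknown dependency `{name}`")
        state[name] = "visiting"
        for dep in manifests[name]:
            visit(dep)
        state[name] = "done"
        order.append(name)

    for name in manifests:
        visit(name)
    return order

# Example mirroring the catalog above: demo packages depend on materialx.
catalog = {
    "materialx": [],
    "seed": ["materialx"],
    "gui": ["materialx"],
}
print(resolve_order(catalog))  # materialx always precedes its dependents
```

A loader following this order can register each package's workflow only after everything it declares is already resolved.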
### Phase 1: Schema Extensions For Config-First Runtime (2-4 days)
@@ -395,7 +395,7 @@ Option B: per-shader only
### Manifest Expectations
- Treat each package like an npm module: `package.json` + `workflows/` folder + clear `assets/`, `scene/` (not “levels”), optional `shaders/`, `gui/`, and `assets/sound` sub-folders so editors, artists, and automation can find the data without guessing.
- Include `defaultWorkflow`, `workflows`, `assets`, `scene`, and `dependencies` fields in `package.json`; bundle notes, template guidance, and a `bundled` flag for platform-specific exports. A package linter will scan these manifests and warn when fields are missing, workflows are orphaned, dependencies are unspecified, or an active workflow lacks an associated C++ step registry entry.
-- Keep the `packages/registry.json` catalog in sync with the manifest layer so workflow loaders can resolve dependencies (e.g., `materialx`) before executing the JSON control plane. Packages that list unused services should emit warnings at lint time—if the workflow does not run it, the service should not register itself or consume startup budget.
+- Let `scripts/package_lint.py` scan `packages/` for `package.json` manifests so workflow loaders can validate dependencies (e.g., `materialx`) before executing the JSON control plane. Packages that list unused services should emit lint warnings and remain dormant unless a workflow references them.
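The required-field check the linter performs can be sketched as follows. The field tuple matches the one in `scripts/package_lint.py`; the example manifest values are illustrative.

```python
REQUIRED_FIELDS = ("name", "version", "description", "workflows", "defaultWorkflow")

def missing_fields(manifest: dict) -> list[str]:
    """Return the required manifest fields that are absent."""
    return [field for field in REQUIRED_FIELDS if field not in manifest]

# Illustrative manifest for a small demo package.
manifest = {
    "name": "seed",
    "version": "0.1.0",
    "description": "Spinning cube demo",
    "workflows": ["workflows/demo_gameplay.json"],
    "defaultWorkflow": "workflows/demo_gameplay.json",
    "dependencies": ["materialx"],
}
print(missing_fields(manifest))  # [] when the manifest is complete
```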
### Asset & Vendor Hygiene
- Copy static assets from `MaterialX/` and the legacy `config/` asset folders (poster textures, fonts, audio catalogs, procedural generator outputs like `scripts/generate_audio_assets.py` and `scripts/generate_cube_stl.py`) into the appropriate `packages/<name>/assets/` subfolders. Once a package carries its own copies of these assets, the on-disk `MaterialX` depot becomes optional; treat it as historical/archival until the workflow-only path is exercised.
@@ -430,7 +430,7 @@ Option B: per-shader only
- Build a package linter that runs as part of `scripts/lint.sh` (or a dedicated CI job) and flags:
- missing `defaultWorkflow` or `workflow` definitions that cover boot/frame phases,
- absent `assets`/`scene` references that the workflow expects,
-- dependencies listed in `registry.json` but not declared in `package.json`,
+- dependencies pointing at packages or directories that do not exist,
- unused services or steps referenced by the workflow but lacking C++ counterparts.
- When we repackage an existing demo, the linter should compare the legacy config/workflow + assets (e.g., the old Lua-driven bundle) to the restored package and warn about any missing pieces (assets, scenes, workflows, or service steps) so we can see what still needs to be ported.
- The linter also ensures that packages import the right assets (GUI folder under `gui/`, sound under `assets/sound`, fonts under `assets/fonts`, shaders under `shaders/`) so the runtime can find them deterministically.
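The deterministic asset-location rule in the last bullet could look like this sketch. The folder conventions come from the text above; the helper name and the mapping shape are assumptions for illustration.

```python
from pathlib import PurePosixPath

# Expected top-level folder for each asset kind, per the conventions above.
EXPECTED_PREFIX = {
    "gui": "gui",
    "sound": "assets/sound",
    "fonts": "assets/fonts",
    "shaders": "shaders",
}

def misplaced_assets(entries: dict[str, list[str]]) -> list[str]:
    """Return warnings for asset paths outside their expected folder."""
    warnings = []
    for kind, paths in entries.items():
        prefix = EXPECTED_PREFIX.get(kind)
        if prefix is None:
            continue  # no convention declared for this kind
        for rel in paths:
            if not PurePosixPath(rel).is_relative_to(prefix):
                warnings.append(f"{kind} asset `{rel}` should live under `{prefix}/`")
    return warnings

print(misplaced_assets({
    "fonts": ["assets/fonts/mono.ttf", "fonts/mono.ttf"],
    "shaders": ["shaders/gui_font.json"],
}))
```

Running such a check at lint time keeps the runtime's asset lookup free of per-package special cases.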


@@ -12,8 +12,7 @@
],
"scene": [
"scene/teleport_points.json"
-]
-,
+],
"dependencies": [
"materialx"
]


@@ -18,8 +18,7 @@
],
"shaders": [
"shaders/gui_font.json"
-]
-,
+],
"dependencies": [
"materialx"
]


@@ -15,8 +15,7 @@
],
"shaders": [
"shaders/quake3_glsl.json"
-]
-,
+],
"dependencies": [
"materialx"
]


@@ -1,40 +0,0 @@
{
"packages": [
{
"name": "seed",
"path": "packages/seed",
"workflow": "workflows/demo_gameplay.json",
"dependencies": ["materialx"]
},
{
"name": "gui",
"path": "packages/gui",
"workflow": "workflows/gui_frame.json",
"dependencies": ["materialx"]
},
{
"name": "soundboard",
"path": "packages/soundboard",
"workflow": "workflows/soundboard_flow.json",
"dependencies": ["materialx"]
},
{
"name": "quake3",
"path": "packages/quake3",
"workflow": "workflows/quake3_frame.json",
"dependencies": ["materialx"]
},
{
"name": "engine_tester",
"path": "packages/engine_tester",
"workflow": "workflows/validation_tour.json",
"dependencies": ["materialx"]
},
{
"name": "materialx",
"path": "packages/materialx",
"workflow": null,
"dependencies": []
}
]
}

scripts/package_lint.py (new file, 239 lines)

@@ -0,0 +1,239 @@
#!/usr/bin/env python3
"""
Lightweight package validator that walks the `packages/` tree for all `package.json` files,
checks their npm-style schema, validates referenced assets/workflows/shaders/scenes, and logs
missing folders and schema violations.
"""
from __future__ import annotations

import argparse
import json
import logging
import sys
from pathlib import Path
from typing import Callable, Iterable, Optional, Sequence

COMMON_FOLDERS = ("assets", "scene", "shaders", "workflows")
REQUIRED_FIELDS = ("name", "version", "description", "workflows", "defaultWorkflow")
FIELD_TO_FOLDER = {
    "assets": "assets",
    "scene": "scene",
    "shaders": "shaders",
    "workflows": "workflows",
}

logger = logging.getLogger("package_lint")

try:
    from jsonschema import Draft7Validator
except ImportError:
    Draft7Validator = None


def load_json(path: Path) -> dict:
    logger.debug("Reading JSON from %s", path)
    with path.open("r", encoding="utf-8") as handle:
        return json.load(handle)


def check_paths(
    root: Path,
    entries: Iterable[str],
    key: str,
    on_exist: Optional[Callable[[Path, str], None]] = None,
) -> Sequence[str]:
    """Return the list of missing files for the given key list, calling `on_exist` for existing items."""
    missing = []
    for rel in entries:
        if not isinstance(rel, str):
            missing.append(f"{rel!r} (not a string)")
            continue
        candidate = root / rel
        logger.debug("Checking %s entry %s", key, candidate)
        if not candidate.exists():
            missing.append(str(rel))
            continue
        if on_exist:
            on_exist(candidate, rel)
    return missing


def validate_workflow_schema(workflow_path: Path, validator) -> list[str]:
    """Validate a workflow JSON file against the provided schema validator."""
    try:
        content = load_json(workflow_path)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    issues: list[str] = []
    for err in sorted(
        validator.iter_errors(content),
        key=lambda x: tuple(x.absolute_path),
    ):
        pointer = "/".join(str(part) for part in err.absolute_path) or "<root>"
        issues.append(f"schema violation at {pointer}: {err.message}")
    return issues


def validate_package(
    pkg_root: Path,
    pkg_data: dict,
    registry_names: Sequence[str],
    available_dirs: Sequence[str],
    workflow_schema_validator: Optional["Draft7Validator"] = None,
) -> tuple[list[str], list[str]]:
    errors: list[str] = []
    warnings: list[str] = []
    logger.debug("Validating %s", pkg_root)
    for field in REQUIRED_FIELDS:
        if field not in pkg_data:
            errors.append(f"missing required field `{field}`")
    workflows = pkg_data.get("workflows")
    default_workflow = pkg_data.get("defaultWorkflow")
    if workflows and isinstance(workflows, list):
        if default_workflow and default_workflow not in workflows:
            errors.append("`defaultWorkflow` is not present in `workflows` array")
    # schema-like validations
    for key in ("workflows", "assets", "scene", "shaders"):
        value = pkg_data.get(key)
        if value is None:
            continue
        if not isinstance(value, list):
            errors.append(f"`{key}` must be an array if present")
            continue
        on_exist: Optional[Callable[[Path, str], None]] = None
        if key == "workflows" and workflow_schema_validator:
            def on_exist(candidate: Path, rel: str) -> None:
                schema_issues = validate_workflow_schema(candidate, workflow_schema_validator)
                for issue in schema_issues:
                    errors.append(f"workflow `{rel}`: {issue}")
        missing = check_paths(pkg_root, value, key, on_exist=on_exist)
        if missing:
            warnings.append(f"{key} entries not found: {missing}")
    # dependencies validation (treat an explicit null the same as an absent list)
    deps = pkg_data.get("dependencies") or []
    if not isinstance(deps, list):
        errors.append("`dependencies` must be an array")
    else:
        known_names = set(registry_names)
        known_names.update(available_dirs)
        for dep in deps:
            if dep not in known_names:
                warnings.append(f"dependency `{dep}` is not known in registry")
    # common folder existence
    for field, folder in FIELD_TO_FOLDER.items():
        entries = pkg_data.get(field) or []
        if entries and not (pkg_root / folder).exists():
            warnings.append(f"common folder `{folder}` referenced but missing")
    return errors, warnings


def main() -> int:
    parser = argparse.ArgumentParser(description="Validate package metadata and assets.")
    parser.add_argument(
        "--packages-root",
        type=Path,
        default=Path("packages"),
        help="Root folder containing package directories",
    )
    parser.add_argument(
        "--workflow-schema",
        type=Path,
        help="Optional workflow JSON schema (default: config/schema/workflow_v1.schema.json when available)",
    )
    parser.add_argument(
        "--verbose",
        action="store_true",
        help="Enable debug logging for tracing validation steps",
    )
    args = parser.parse_args()
    logging.basicConfig(
        format="%(levelname)s: %(message)s",
        level=logging.DEBUG if args.verbose else logging.INFO,
    )
    if not args.packages_root.exists():
        logger.error("packages root %s does not exist", args.packages_root)
        return 2
    schema_candidate = args.workflow_schema
    default_schema = Path("config/schema/workflow_v1.schema.json")
    if schema_candidate is None and default_schema.exists():
        schema_candidate = default_schema
    workflow_validator: Optional["Draft7Validator"] = None
    if schema_candidate:
        if not schema_candidate.exists():
            logger.error("specified workflow schema %s not found", schema_candidate)
            return 5
        try:
            workflow_schema = load_json(schema_candidate)
        except json.JSONDecodeError as exc:
            logger.error("invalid JSON schema %s: %s", schema_candidate, exc)
            return 6
        if Draft7Validator is None:
            logger.warning("jsonschema dependency not installed; skipping workflow schema validation")
        else:
            try:
                workflow_validator = Draft7Validator(workflow_schema)
            except Exception as exc:
                logger.error("failed to compile workflow schema %s: %s", schema_candidate, exc)
                return 7
    package_dirs = [
        child
        for child in sorted(args.packages_root.iterdir())
        if child.is_dir() and (child / "package.json").exists()
    ]
    if not package_dirs:
        logger.warning("no package directories with package.json found under %s", args.packages_root)
    loaded_packages: list[tuple[Path, dict]] = []
    summary_errors = 0
    summary_warnings = 0
    for pkg_root in package_dirs:
        pkg_json_file = pkg_root / "package.json"
        try:
            pkg_data = load_json(pkg_json_file)
        except json.JSONDecodeError as exc:
            logger.error("invalid JSON in %s: %s", pkg_json_file, exc)
            summary_errors += 1
            continue
        loaded_packages.append((pkg_root, pkg_data))
    registry_names = [
        pkg_data.get("name")
        for _, pkg_data in loaded_packages
        if isinstance(pkg_data.get("name"), str)
    ]
    available_dirs = [entry.name for entry in args.packages_root.iterdir() if entry.is_dir()]
    for pkg_root, pkg_data in loaded_packages:
        pkg_json_file = pkg_root / "package.json"
        errors, warnings = validate_package(
            pkg_root,
            pkg_data,
            registry_names,
            available_dirs,
            workflow_validator,
        )
        for err in errors:
            logger.error("%s: %s", pkg_json_file, err)
        for warn in warnings:
            logger.warning("%s: %s", pkg_json_file, warn)
        summary_errors += len(errors)
        summary_warnings += len(warnings)
    logger.info("lint complete: %d errors, %d warnings", summary_errors, summary_warnings)
    return 1 if summary_errors else 0


if __name__ == "__main__":
    sys.exit(main())