# Default Application Workflow
This workflow package provides a comprehensive, production-ready workflow that combines backend initialization with an iterative AI agent loop. It demonstrates the "dogfooding" approach where AutoMetabuilder's own application logic is expressed as a declarative workflow.
## Overview
The Default Application Workflow is a complete end-to-end workflow that:
- **Bootstraps the backend** - Loads all necessary configuration, clients, and tools
- **Executes the AI loop** - Runs the core AutoMetabuilder agent with tool-calling capabilities
This workflow replaces the imperative Python code that was previously in `app_runner.py`, making the application logic:
- **Declarative** - Expressed as data (JSON) rather than code
- **Visual** - Can be visualized as a node graph
- **Testable** - Each node can be tested independently
- **Modular** - Easy to modify, extend, or replace nodes
## Workflow Structure

### Phase 1: Backend Bootstrap (9 nodes)
These nodes initialize all backend services and dependencies:
1. **Load Messages** (`backend.load_messages`)
   - Loads internationalized translation messages
   - Stores in `runtime.context["msgs"]`
2. **Load Metadata** (`backend.load_metadata`)
   - Loads `metadata.json` configuration
   - Stores in `runtime.context["metadata"]`
3. **Load Prompt** (`backend.load_prompt`)
   - Loads `prompt.yml` configuration
   - Resolves the model name from the environment or the prompt
   - Stores in `runtime.context["prompt"]` and `runtime.context["model_name"]`
4. **Create GitHub Client** (`backend.create_github`)
   - Initializes the GitHub API client
   - Requires the `GITHUB_TOKEN` environment variable
   - Stores in `runtime.context["gh"]`
5. **Create OpenAI Client** (`backend.create_openai`)
   - Initializes the OpenAI/LLM client
   - Uses the GitHub token for authentication
   - Stores in `runtime.context["client"]`
6. **Load Tools** (`backend.load_tools`)
   - Loads tool definitions from metadata
   - Stores in `runtime.context["tools"]`
7. **Build Tool Map** (`backend.build_tool_map`)
   - Creates the callable tool registry
   - Maps tool names to implementations
   - Stores in `runtime.context["tool_map"]`
8. **Load Plugins** (`backend.load_plugins`)
   - Loads any custom user plugins
   - Registers them in the tool map
9. **Load Tool Policies** (`backend.load_tool_policies`)
   - Loads tool execution policies
   - Defines which tools require confirmation
   - Stores in `runtime.context["tool_policies"]`
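As an illustrative sketch, the bootstrap phase amounts to running each node in order against a shared runtime. The `Runtime` class and node signatures below are assumptions for illustration, not the actual AutoMetabuilder API, and only two of the nine nodes are stubbed out:

```python
class Runtime:
    """Illustrative runtime; the real AutoMetabuilder runtime may differ."""
    def __init__(self):
        self.context = {}   # set once during bootstrap, read by all nodes
        self.store = {}     # mutable per-execution state

def load_messages(runtime):
    # Stand-in for backend.load_messages; real code reads translation files.
    runtime.context["msgs"] = {"greeting": "hello"}

def load_metadata(runtime):
    # Stand-in for backend.load_metadata; real code reads metadata.json.
    runtime.context["metadata"] = {
        "workflow_path": "packages/default_app_workflow/workflow.json"
    }

BOOTSTRAP_NODES = [load_messages, load_metadata]  # ...plus the other seven nodes

runtime = Runtime()
for node in BOOTSTRAP_NODES:
    node(runtime)
# runtime.context now holds everything the AI loop phase needs
```

Each node writes into `runtime.context`, so by the end of the phase the agent loop finds all of its dependencies in one place.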
### Phase 2: AI Agent Loop (8 nodes)
These nodes execute the core AutoMetabuilder agent:
1. **Load Context** (`core.load_context`)
   - Loads SDLC context (roadmap, issues, PRs)
   - Provides situational awareness
2. **Seed Messages** (`core.seed_messages`)
   - Initializes an empty message array
   - Prepares conversation state
3. **Append Context** (`core.append_context_message`)
   - Adds SDLC context to the messages
   - Gives the AI awareness of the repository state
4. **Append User Instruction** (`core.append_user_instruction`)
   - Adds the user's task instruction
   - Defines what the AI should accomplish
5. **Main Loop** (`control.loop`)
   - Iterative execution controller
   - Runs up to 10 iterations
   - Stops when the AI has no more tool calls
6. **AI Request** (`core.ai_request`)
   - Sends messages to the LLM
   - Gets back a response and optional tool calls
7. **Run Tool Calls** (`core.run_tool_calls`)
   - Executes the requested tool calls
   - Handles confirmation prompts
   - Returns results
8. **Append Tool Results** (`core.append_tool_results`)
   - Adds tool results to the messages
   - Loops back to Main Loop for the next iteration
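The loop above can be sketched in plain Python. The function names here are illustrative stand-ins for the `core.*` plugins, not the real implementations:

```python
def run_agent_loop(messages, ai_request, run_tool_calls, max_iterations=10):
    """Sketch of the Main Loop: iterate until the model stops requesting tools."""
    for _ in range(max_iterations):
        response, tool_calls = ai_request(messages)          # AI Request node
        messages.append({"role": "assistant", "content": response})
        if not tool_calls:                                   # Main Loop exit condition
            break
        results = run_tool_calls(tool_calls)                 # Run Tool Calls node
        for result in results:                               # Append Tool Results node
            messages.append({"role": "tool", "content": result})
    return messages

# Toy stand-ins: one round of tool calls, then a final answer with none.
calls = iter([["list_issues"], []])
msgs = run_agent_loop(
    [],
    ai_request=lambda m: ("thinking", next(calls)),
    run_tool_calls=lambda tc: ["ran " + t for t in tc],
)
# msgs: assistant message, tool result, final assistant message
```

The real plugins also carry richer message shapes (tool-call IDs, function arguments), but the control flow is the same: request, execute, append, repeat.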
## Usage
This workflow is automatically loaded when you run AutoMetabuilder:
Set the workflow path in `metadata.json`:

```json
{
  "workflow_path": "packages/default_app_workflow/workflow.json"
}
```

Then run:

```bash
autometabuilder
```
The `app_runner.py` module now simply:
- Loads environment and configuration
- Parses command line arguments
- Loads this workflow
- Executes it
## Benefits of Workflow-Based Architecture
### 1. Separation of Concerns
- Backend initialization is isolated from AI logic
- Each phase can be tested independently
- Easy to add new initialization steps
### 2. Flexibility
- Swap out individual nodes without touching code
- Try different AI loop strategies
- Add monitoring or logging nodes
### 3. Observability
- Clear execution order
- Easy to trace data flow
- Can add debug nodes at any point
### 4. Extensibility
- Create variant workflows for different use cases
- Mix and match nodes from other packages
- Build custom workflows without code changes
## Data Flow

```text
Environment Variables (GITHUB_TOKEN, LLM_MODEL)
        ↓
Backend Bootstrap Phase
        ↓
runtime.context populated with:
  - msgs (translations)
  - metadata (config)
  - prompt (agent instructions)
  - model_name (LLM to use)
  - gh (GitHub client)
  - client (OpenAI client)
  - tools (tool definitions)
  - tool_map (callable tools)
  - tool_policies (execution policies)
        ↓
AI Agent Loop Phase
        ↓
Iterative execution:
  - Load SDLC context
  - Send to LLM with tools
  - Execute tool calls
  - Append results
  - Repeat until done
```
## Customization
To create a custom variant:
1. Copy this package:

   ```bash
   cp -r packages/default_app_workflow packages/my_custom_workflow
   ```

2. Edit `workflow.json`:
   - Add/remove nodes
   - Change connections
   - Modify parameters

3. Update `package.json`:

   ```json
   {
     "name": "my_custom_workflow",
     "description": "My custom AutoMetabuilder workflow"
   }
   ```

4. Update `metadata.json`:

   ```json
   {
     "workflow_path": "packages/my_custom_workflow/workflow.json"
   }
   ```
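A minimal custom `workflow.json` might look roughly like this. The node `type` values mirror the plugin names listed earlier, but the surrounding schema (field names such as `nodes`, `connections`, `id`) is an assumption for illustration; consult the shipped `workflow.json` for the real shape:

```json
{
  "nodes": [
    { "id": "load_metadata", "type": "backend.load_metadata" },
    { "id": "ai_request", "type": "core.ai_request" }
  ],
  "connections": [
    { "from": "load_metadata", "to": "ai_request" }
  ]
}
```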
## Related Workflows
- **backend_bootstrap** - Just the initialization phase; useful for testing
- **single_pass** - One-shot AI request without iteration
- **iterative_loop** - Just the AI loop; assumes the backend is initialized
- **plan_execute_summarize** - Multi-phase workflow with explicit planning
## Technical Notes

### Runtime Context vs Store
- **Context** (`runtime.context`) - Immutable configuration and dependencies
  - Set once during bootstrap
  - Available to all nodes
  - Contains clients, tools, and settings
- **Store** (`runtime.store`) - Mutable execution state
  - Changes during execution
  - Node outputs are stored here
  - Holds temporary working data
### Plugin Responsibility
Backend workflow plugins (`backend.*`) have a dual responsibility:
- Return their result in the output dict (for the store)
- Update `runtime.context` directly (for downstream plugins)

This ensures that both workflow data flow and imperative access work correctly.
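A plugin honoring this contract might look like the following sketch. The function signature, the `FakeRuntime` class, and the policy data are assumptions for illustration; the real plugin interface may differ:

```python
def load_tool_policies(runtime, params=None):
    """Hypothetical backend.* plugin showing the dual responsibility."""
    policies = {"delete_file": {"requires_confirmation": True}}  # stand-in data
    runtime.context["tool_policies"] = policies  # (2) imperative access downstream
    return {"tool_policies": policies}           # (1) output dict, for the store

class FakeRuntime:
    """Minimal stand-in exposing the context/store pair described above."""
    def __init__(self):
        self.context = {}
        self.store = {}

rt = FakeRuntime()
rt.store["load_tool_policies"] = load_tool_policies(rt)
# The same object is now reachable through both the store and the context.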
Version History
- 1.0.0 - Initial release combining backend bootstrap and AI loop
- Replaces imperative
app_runner.pylogic - Enables "dogfooding" of workflow architecture
- 17 nodes total: 9 bootstrap + 8 AI loop
- Replaces imperative