FlowMason

Architecture Overview

Understanding FlowMason's layered architecture and execution model.

FlowMason uses a layered architecture designed for local-first development with production-ready deployment capabilities.

System Layers

```mermaid
flowchart TB
    subgraph UI["UI Layer"]
        VSCode["VSCode Extension"]
        Studio["Studio Frontend"]
    end

    subgraph API["API Layer"]
        REST["REST API"]
        WS["WebSocket"]
    end

    subgraph Core["Core Framework"]
        Executor["Execution Engine"]
        Registry["Component Registry"]
        Loader["Pipeline Loader"]
    end

    subgraph Components["Components"]
        Nodes["AI Nodes"]
        Operators["Operators"]
        ControlFlow["Control Flow"]
    end

    UI --> API
    API --> Core
    Core --> Components
```

Pipeline Execution Flow

When you run a pipeline, FlowMason executes stages in waves based on their dependencies:

```mermaid
flowchart LR
    subgraph Wave1["Wave 1"]
        A["fetch"]
        B["config"]
    end

    subgraph Wave2["Wave 2"]
        C["extract"]
    end

    subgraph Wave3["Wave 3"]
        D["summarize"]
        E["analyze"]
    end

    subgraph Wave4["Wave 4"]
        F["combine"]
    end

    A --> C
    B --> C
    C --> D
    C --> E
    D --> F
    E --> F
```

Wave-based execution means:

  • Stages with no dependencies run in parallel (Wave 1)
  • Stages wait for their dependencies to complete
  • Independent stages in later waves also run in parallel
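
The wave grouping above can be sketched as a small topological scheduler. This is a sketch only, assuming stages arrive as a name-to-dependencies mapping; the function and variable names are illustrative, not FlowMason's actual API:

```python
def execution_waves(deps: dict[str, set[str]]) -> list[set[str]]:
    """Group stages into waves: a stage joins the first wave that
    starts after all of its dependencies have completed."""
    waves: list[set[str]] = []
    done: set[str] = set()
    remaining = dict(deps)
    while remaining:
        # Every stage whose dependencies are all satisfied runs together.
        ready = {stage for stage, d in remaining.items() if d <= done}
        if not ready:
            raise ValueError("cycle detected in stage dependencies")
        waves.append(ready)
        done |= ready
        for stage in ready:
            del remaining[stage]
    return waves

# The pipeline from the diagram above.
pipeline = {
    "fetch": set(), "config": set(),
    "extract": {"fetch", "config"},
    "summarize": {"extract"}, "analyze": {"extract"},
    "combine": {"summarize", "analyze"},
}
```

Each returned set is one wave whose stages can run concurrently, e.g. via `asyncio.gather`.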

Component Types

```mermaid
flowchart TD
    Component["Component"]
    Component --> Node["Node"]
    Component --> Operator["Operator"]
    Component --> Control["Control Flow"]

    Node --> |"Uses LLM"| AI["AI Processing"]
    Operator --> |"Deterministic"| Transform["Data Transform"]
    Control --> |"Directives"| Flow["Execution Logic"]

    AI --> Generator["generator"]
    AI --> Critic["critic"]
    AI --> Improver["improver"]

    Transform --> HTTP["http-request"]
    Transform --> JSON["json-transform"]
    Transform --> Filter["filter"]

    Flow --> Conditional["conditional"]
    Flow --> ForEach["foreach"]
    Flow --> TryCatch["trycatch"]
```
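
The taxonomy above could be modeled as a small class hierarchy. The sketch below is an assumption about the shape, not FlowMason's actual base classes; `JsonTransform` is a hypothetical stand-in for the built-in `json-transform` operator:

```python
from abc import ABC, abstractmethod

class Component(ABC):
    """Base for all pipeline components (illustrative only)."""
    @abstractmethod
    def run(self, inputs: dict) -> dict: ...

class Operator(Component):
    """Deterministic data transform: same inputs always yield same outputs."""

class JsonTransform(Operator):
    # Hypothetical operator that extracts one key from its input payload.
    def __init__(self, key: str):
        self.key = key

    def run(self, inputs: dict) -> dict:
        return {"value": inputs[self.key]}
```

Nodes would differ by calling an LLM inside `run`, which is why their outputs are not deterministic the way an operator's are.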

Data Flow

Data flows through stages via template expressions:

```mermaid
sequenceDiagram
    participant Input as Pipeline Input
    participant Stage1 as fetch
    participant Stage2 as extract
    participant Stage3 as summarize
    participant Output as Pipeline Output

    Input->>Stage1: {{input.url}}
    Stage1->>Stage2: {{stages.fetch.output.body}}
    Stage2->>Stage3: {{stages.extract.output.content}}
    Stage3->>Output: {{stages.summarize.output.summary}}
```
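
Resolving an expression like `{{stages.fetch.output.body}}` amounts to walking a dotted path through the run's context. A simplified sketch, not FlowMason's actual template engine:

```python
import re

def resolve(template: str, context: dict) -> str:
    """Substitute each {{dotted.path}} expression with the value found
    by walking that path through the context dictionary."""
    def lookup(match: re.Match) -> str:
        value = context
        for part in match.group(1).split("."):
            value = value[part]
        return str(value)
    return re.sub(r"\{\{\s*([\w.]+)\s*\}\}", lookup, template)

ctx = {"stages": {"fetch": {"output": {"body": "<html>hello</html>"}}}}
resolve("{{stages.fetch.output.body}}", ctx)
```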

WebSocket Communication

Real-time updates flow through WebSocket connections:

```mermaid
sequenceDiagram
    participant Client as VSCode/Studio
    participant Server as Studio Backend
    participant Engine as Execution Engine

    Client->>Server: Subscribe to run
    Server->>Engine: Start execution

    loop Each Stage
        Engine->>Server: Stage started
        Server->>Client: stage:started event
        Engine->>Server: Stage completed
        Server->>Client: stage:completed event
    end

    Engine->>Server: Run completed
    Server->>Client: run:completed event
```
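
On the client side, these events can be routed through a small dispatcher keyed by event name. The event names mirror the diagram; the message shape and class names below are assumptions for illustration:

```python
from collections import defaultdict
from typing import Callable

class RunEventBus:
    """Route incoming run messages to handlers registered per event name."""
    def __init__(self) -> None:
        self.handlers: dict[str, list[Callable]] = defaultdict(list)

    def on(self, event: str, handler: Callable) -> None:
        self.handlers[event].append(handler)

    def dispatch(self, message: dict) -> None:
        # Assumed message shape: {"event": "stage:started", "stage": "fetch"}
        for handler in self.handlers[message["event"]]:
            handler(message)

bus = RunEventBus()
log: list[str] = []
bus.on("stage:started", lambda m: log.append(f"started {m['stage']}"))
bus.on("run:completed", lambda m: log.append("done"))
bus.dispatch({"event": "stage:started", "stage": "fetch"})
bus.dispatch({"event": "run:completed"})
```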

Debug Session Flow

```mermaid
stateDiagram-v2
    [*] --> Idle
    Idle --> Running: Start Debug (F5)
    Running --> Paused: Hit Breakpoint
    Paused --> Running: Continue (F5)
    Paused --> Stepping: Step Over (F10)
    Stepping --> Paused: Step Complete
    Paused --> Idle: Stop (Shift+F5)
    Running --> Idle: Run Complete
```
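
The state diagram above translates directly into a transition table. Event names here are made up for the sketch; only the states and transitions come from the diagram:

```python
# (current state, event) -> next state, derived from the diagram above.
TRANSITIONS = {
    ("Idle", "start"): "Running",
    ("Running", "breakpoint"): "Paused",
    ("Paused", "continue"): "Running",
    ("Paused", "step_over"): "Stepping",
    ("Stepping", "step_complete"): "Paused",
    ("Paused", "stop"): "Idle",
    ("Running", "run_complete"): "Idle",
}

def step(state: str, event: str) -> str:
    try:
        return TRANSITIONS[(state, event)]
    except KeyError:
        raise ValueError(f"no transition for {event!r} in state {state!r}")
```

A session that starts, hits a breakpoint, steps once, and stops ends back in `Idle`.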

Hybrid Deployment Model

FlowMason follows a Salesforce DX-inspired hybrid model:

```mermaid
flowchart LR
    subgraph Local["Local Development"]
        Files["JSON Files"]
        SQLite["SQLite DB"]
    end

    subgraph Staging["Staging"]
        StagingDB["PostgreSQL"]
    end

    subgraph Production["Production"]
        ProdDB["PostgreSQL"]
        ProdAPI["API Gateway"]
    end

    Files -->|"fm deploy"| StagingDB
    StagingDB -->|"fm promote"| ProdDB
    ProdDB --> ProdAPI
```

Environment Progression:

| Environment | Storage | Use Case |
|-------------|---------|----------|
| Local | JSON files | Development & testing |
| Local Org | SQLite | Integration testing |
| Staging | PostgreSQL | QA & validation |
| Production | PostgreSQL | Live workloads |

Error Handling

FlowMason uses a MuleSoft-inspired error classification:

```mermaid
flowchart TD
    Error["Error"]
    Error --> Connectivity["CONNECTIVITY"]
    Error --> Timeout["TIMEOUT"]
    Error --> Validation["VALIDATION"]
    Error --> Transform["TRANSFORM"]
    Error --> Security["SECURITY"]
    Error --> Expression["EXPRESSION"]

    Connectivity --> ConnTimeout["CONNECTIVITY:TIMEOUT"]
    Connectivity --> ConnRefused["CONNECTIVITY:REFUSED"]

    Validation --> ValInput["VALIDATION:INVALID_INPUT"]
    Validation --> ValSchema["VALIDATION:SCHEMA"]

    Security --> SecAuth["SECURITY:UNAUTHORIZED"]
    Security --> SecForbid["SECURITY:FORBIDDEN"]
```
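
One way to represent `NAMESPACE:IDENTIFIER` codes like these is an exception hierarchy where each class contributes its own code. The class names and structure below are a sketch under that assumption, not FlowMason's actual error types:

```python
class FlowMasonError(Exception):
    """Base error carrying a namespaced code (illustrative only)."""
    namespace = "ANY"
    identifier = "UNKNOWN"

    @property
    def code(self) -> str:
        # Codes follow the NAMESPACE:IDENTIFIER shape from the diagram.
        return f"{self.namespace}:{self.identifier}"

class ConnectivityTimeout(FlowMasonError):
    namespace, identifier = "CONNECTIVITY", "TIMEOUT"

class SchemaValidation(FlowMasonError):
    namespace, identifier = "VALIDATION", "SCHEMA"
```

Handlers can then match on the broad namespace (any `CONNECTIVITY` error) or on the full code for a specific failure.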

Registry System

Components are discovered and registered automatically:

```mermaid
flowchart LR
    subgraph Discovery["Discovery"]
        Glob["Glob Patterns"]
        Decorators["@node, @operator"]
    end

    subgraph Registry["Component Registry"]
        Extract["Extractor"]
        Store["Registry Store"]
    end

    subgraph Usage["Usage"]
        Pipeline["Pipeline Loader"]
        Executor["Executor"]
    end

    Discovery --> Extract
    Extract --> Store
    Store --> Pipeline
    Pipeline --> Executor
```
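
Decorator-based registration typically boils down to writing the decorated class into a lookup table at import time. The sketch below assumes a simple `kind:name` keying scheme; the real `@node` / `@operator` decorators may differ:

```python
REGISTRY: dict[str, type] = {}

def _register(kind: str):
    """Build a decorator that stores the class under 'kind:classname'."""
    def decorator(cls: type) -> type:
        REGISTRY[f"{kind}:{cls.__name__.lower()}"] = cls
        return cls  # the class itself is unchanged
    return decorator

# Hypothetical stand-ins for FlowMason's @node / @operator decorators.
node = _register("node")
operator = _register("operator")

@node
class Generator:
    """An AI node."""

@operator
class Filter:
    """A deterministic operator."""
```

Once modules are imported (e.g. via glob discovery), the pipeline loader can look components up by key without any manual wiring.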

Key Design Principles

  1. File-First Development - Pipelines are JSON files, enabling Git version control
  2. Type Safety - Python type hints with Pydantic validation
  3. Async-Native - Built on asyncio for efficient concurrent execution
  4. Provider Agnostic - Works with any LLM provider
  5. Observable - Full debugging and monitoring support