FlowMason

Deployment Model

How FlowMason's Salesforce DX-inspired hybrid deployment model takes pipelines from local development to production.

FlowMason uses a Salesforce DX-inspired hybrid model that combines file-based local development with database-backed production deployments.

The Hybrid Approach

flowchart LR
    subgraph Local["Local Development"]
        Files["JSON Files"]
        Git["Git Version Control"]
    end

    subgraph Staging["Staging Environment"]
        StagingDB[(PostgreSQL)]
        StagingAPI["API Server"]
    end

    subgraph Production["Production Environment"]
        ProdDB[(PostgreSQL)]
        ProdAPI["API Gateway"]
        ProdWorkers["Workers"]
    end

    Files --> Git
    Git -->|"fm deploy staging"| StagingDB
    StagingDB --> StagingAPI
    StagingAPI -->|"fm promote prod"| ProdDB
    ProdDB --> ProdAPI
    ProdDB --> ProdWorkers
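
In CLI terms, the path through the diagram maps onto the commands covered in the workflow sections below:

# Local: run straight from the JSON files
fm run pipelines/my-pipeline.pipeline.json

# Staging: deploy the current pipelines and run against the staging database
fm deploy staging
fm run my-pipeline --env staging

# Production: promote the validated staging deployment
fm promote prod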

Environment Comparison

| Aspect | Local | Local Org | Staging | Production |
|---|---|---|---|---|
| Storage | JSON files | SQLite | PostgreSQL | PostgreSQL |
| Purpose | Development | Integration test | QA/Validation | Live workloads |
| Multi-user | No | No | Yes | Yes |
| Git tracked | Yes | Partial | No | No |
| Rollback | Git revert | Manual | Database | Database |

Development Workflow

1. Local Development

Work with JSON pipeline files directly:

# Create a new project
fm init my-project
cd my-project

# Edit pipelines as JSON files
code pipelines/my-pipeline.pipeline.json

# Run locally
fm run pipelines/my-pipeline.pipeline.json
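
The pipeline schema itself depends on your FlowMason version; the sketch below is only illustrative, and the field names (name, version, stages, component, config) are assumptions rather than a documented contract:

// pipelines/my-pipeline.pipeline.json (hypothetical shape)
{
  "name": "my-pipeline",
  "version": "1.0.0",
  "stages": [
    {
      "id": "fetch",
      "component": "http_source",
      "config": { "url": "https://api.example.com/events" }
    },
    {
      "id": "store",
      "component": "postgres_sink",
      "config": { "table": "events" }
    }
  ]
}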

2. Local Org Testing

Test against a local, SQLite-backed org that simulates the database-backed production model:

# Push to local org
fm push

# Run from local org
fm run my-pipeline --org local

# Pull changes back to files
fm pull
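
Because the local org is just the SQLite database referenced in flowmason.json (see Environment Configuration below), you can inspect it directly after a push as a quick sanity check; the exact table layout depends on your FlowMason version:

# Open the local org database (sqlite:///flowmason.db in this project's config)
sqlite3 flowmason.db ".tables"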

3. Deploy to Staging

# Configure staging environment
fm config set staging.url https://staging.flowmason.example.com
fm config set staging.api_key $STAGING_API_KEY

# Deploy to staging
fm deploy staging

# Run in staging
fm run my-pipeline --env staging

4. Promote to Production

# Promote from staging to production
fm promote prod

# Or deploy directly (not recommended)
fm deploy prod --force

Salesforce DX Command Mapping

If you’re familiar with Salesforce DX, here’s how FlowMason commands compare:

| Salesforce DX | FlowMason | Description |
|---|---|---|
| sf project create | fm init | Create new project |
| sf deploy metadata | fm deploy | Deploy to org |
| sf retrieve metadata | fm pull | Pull from org |
| sf apex run | fm run | Execute code |
| sf org create scratch | fm org create | Create test org |

CI/CD Integration

GitHub Actions

name: Deploy FlowMason Pipelines

on:
  push:
    branches: [main]
  pull_request:
    branches: [main]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: '3.11'

      - name: Install FlowMason
        run: pip install flowmason

      - name: Run tests
        run: fm test --coverage

  deploy-staging:
    needs: test
    if: github.ref == 'refs/heads/main'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: '3.11'

      - name: Install FlowMason
        run: pip install flowmason

      - name: Deploy to staging
        env:
          FM_STAGING_URL: ${{ secrets.STAGING_URL }}
          FM_STAGING_KEY: ${{ secrets.STAGING_API_KEY }}
        run: fm deploy staging

  deploy-prod:
    needs: deploy-staging
    if: github.ref == 'refs/heads/main'
    runs-on: ubuntu-latest
    environment: production
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: '3.11'

      - name: Install FlowMason
        run: pip install flowmason

      - name: Promote to production
        env:
          FM_PROD_URL: ${{ secrets.PROD_URL }}
          FM_PROD_KEY: ${{ secrets.PROD_API_KEY }}
        run: fm promote prod
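
GitLab CI

The same flow translates to other CI systems. A minimal GitLab CI sketch is shown below, assuming FM_STAGING_URL, FM_STAGING_KEY, FM_PROD_URL and FM_PROD_KEY are defined as protected CI/CD variables; the job and stage names are illustrative:

# .gitlab-ci.yml
stages: [test, staging, production]

test:
  stage: test
  image: python:3.11
  script:
    - pip install flowmason
    - fm test --coverage

deploy-staging:
  stage: staging
  image: python:3.11
  rules:
    - if: '$CI_COMMIT_BRANCH == "main"'
  script:
    - pip install flowmason
    - fm deploy staging

promote-prod:
  stage: production
  image: python:3.11
  environment: production
  rules:
    - if: '$CI_COMMIT_BRANCH == "main"'
      when: manual
  script:
    - pip install flowmason
    - fm promote prod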

Deployment Architecture

flowchart TB
    subgraph Developer["Developer Machines"]
        D1["Developer 1"]
        D2["Developer 2"]
        D3["Developer 3"]
    end

    subgraph VCS["Version Control"]
        GitHub["GitHub/GitLab"]
        CI["CI/CD Pipeline"]
    end

    subgraph Cloud["Cloud Infrastructure"]
        LB["Load Balancer"]
        API1["API Server 1"]
        API2["API Server 2"]
        DB[(PostgreSQL)]
        Redis["Redis Cache"]
        Workers["Worker Pool"]
    end

    D1 --> GitHub
    D2 --> GitHub
    D3 --> GitHub
    GitHub --> CI
    CI -->|deploy| LB
    LB --> API1
    LB --> API2
    API1 --> DB
    API2 --> DB
    API1 --> Redis
    API2 --> Redis
    DB --> Workers

Environment Configuration

Local Configuration

// flowmason.json
{
  "name": "my-project",
  "version": "1.0.0",
  "environments": {
    "local": {
      "database": "sqlite:///flowmason.db"
    },
    "staging": {
      "url": "${FM_STAGING_URL}",
      "api_key": "${FM_STAGING_KEY}"
    },
    "production": {
      "url": "${FM_PROD_URL}",
      "api_key": "${FM_PROD_KEY}"
    }
  }
}

Secret Management

FlowMason supports multiple secret backends:

# Environment variables (default)
export FM_API_KEY=sk-xxx

# AWS Secrets Manager
fm config set secrets.backend aws
fm config set secrets.aws_region us-east-1

# HashiCorp Vault
fm config set secrets.backend vault
fm config set secrets.vault_addr https://vault.example.com
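
With the default environment-variable backend, the ${...} placeholders in flowmason.json above are filled from the shell environment, so a staging deploy from a workstation looks like the following sketch:

# Provide the values referenced by flowmason.json, then deploy
export FM_STAGING_URL=https://staging.flowmason.example.com
export FM_STAGING_KEY=$STAGING_API_KEY
fm deploy staging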

Rollback Strategies

From Files (Local)

# Revert to previous commit
git revert HEAD
fm push

From Database (Production)

# List deployment history
fm history --env prod

# Rollback to specific version
fm rollback v1.2.3 --env prod

# Rollback to previous deployment
fm rollback --previous --env prod

Multi-Tenant Deployment

For enterprise deployments, FlowMason supports multi-tenancy:

flowchart TB
    subgraph Tenants["Tenant Isolation"]
        T1["Tenant A"]
        T2["Tenant B"]
        T3["Tenant C"]
    end

    subgraph Shared["Shared Infrastructure"]
        API["API Gateway"]
        Auth["Auth Service"]
    end

    subgraph Isolated["Isolated Resources"]
        DB1[(Database A)]
        DB2[(Database B)]
        DB3[(Database C)]
    end

    T1 --> API
    T2 --> API
    T3 --> API
    API --> Auth
    Auth --> DB1
    Auth --> DB2
    Auth --> DB3

Tenant Configuration

# config/tenants.yaml
tenants:
  acme-corp:
    database: postgresql://acme:${ACME_DB_PASSWORD}@db.example.com/acme
    quota:
      pipelines: 100
      executions_per_day: 10000
    features:
      - advanced-debugging
      - custom-components

  startup-inc:
    database: postgresql://startup:${STARTUP_DB_PASSWORD}@db.example.com/startup
    quota:
      pipelines: 20
      executions_per_day: 1000

Best Practices

  1. Never deploy directly to production - Always go through staging first
  2. Use environment variables - Keep secrets out of version control
  3. Automate deployments - Use CI/CD for consistent deployments
  4. Monitor rollout - Watch metrics during deployment
  5. Keep environments in sync - Staging should mirror production
  6. Version your pipelines - Use semantic versioning for pipeline changes
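
For the last point, one lightweight convention (an assumption, not something FlowMason enforces) is to bump the version field in the pipeline file, if your schema includes one, and tag the commit so the versions you pass to fm rollback line up with git history:

# Bump "version" in pipelines/my-pipeline.pipeline.json, then record it
git add pipelines/my-pipeline.pipeline.json
git commit -m "my-pipeline: bump to v1.3.0"
git tag my-pipeline-v1.3.0
git push --tags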