Workflow Automation

JSON-based workflow definitions with 8 node types, Temporal orchestration, and human approval gates for automated business processes.

Imagine...

Every Monday morning you spend two hours pulling metrics from your database, making charts in Excel, and emailing the team. Every single week. It's tedious, it's error-prone, and you forget half the time.

What if it just... happened? Automatically. Perfectly. Every time.

You

"Every Monday at 8am, check last week's revenue. If it dropped more than 10%, alert me on Slack with the top 10 customers and their recent activity. Otherwise, just post a summary to #metrics."

Your Awareness

*Creates workflow: Schedule trigger → Query revenue → Condition check → Slack notification*

"Done. The workflow is active. It'll run every Monday at 8am. Want to test it now to see what it looks like?"

You

"Yes. And can you also add a step where if revenue dropped, it automatically creates a customer health dashboard so I can investigate?"

Your Awareness

*Adds dashboard generation node to workflow, runs test execution*

"Updated. Test complete—here's your Slack alert and the auto-generated dashboard. Looks good?"

This isn't magic. This is collaboration.

Describe what should happen automatically. Your Awareness builds the workflow. You never manually pull metrics again.

Workflow Nodes

Action Node

Execute service operations

  • Postgres (query, insert, update, delete)
  • Neo4j (Cypher queries)
  • Qdrant (vector search, upsert)
  • HTTP (REST API calls)
  • LLM (completion, embedding)
  • Redis, Email, MinIO, Transform
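
A minimal action node, following the node schema used in the ETL example further down (the node id, URL, and variable names here are illustrative):

```json
{
  "id": "fetch_user",
  "type": "action",
  "name": "Fetch User",
  "action": {
    "service": "http",
    "operation": "request",
    "config": {
      "url": "{{variables.apiUrl}}/users/{{trigger.userId}}",
      "method": "GET",
      "headers": {
        "Authorization": "Bearer {{variables.apiKey}}"
      }
    }
  }
}
```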

Condition Node

Branch based on conditions

  • Comparison operators (eq, gt, lt, etc.)
  • String matching (contains, matches)
  • Existence checks
  • True/false branch routing

Loop Node

Iterate over data

  • forEach (array iteration)
  • while (condition-based)
  • times (fixed count)
  • Parallel execution with batch control
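
A sketch of a forEach loop node. The exact config keys (`mode`, `items`, `batchSize`, `body`) are assumptions for illustration; only the forEach/while/times modes and batch control are documented above:

```json
{
  "id": "process_rows",
  "type": "loop",
  "name": "Process Each Row",
  "loop": {
    "mode": "forEach",
    "items": "{{extract.output.rows}}",  // array from a previous node
    "parallel": true,
    "batchSize": 10                      // run up to 10 iterations at once
  },
  "body": ["transform_row"]              // node(s) executed per iteration
}
```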

Parallel Node

Execute branches concurrently

  • Multiple independent branches
  • Wait for all / any / N completions
  • Error isolation per branch
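
A sketch of a parallel node fanning out to two branches. Field names (`branches`, `waitFor`) are illustrative assumptions based on the capabilities listed above:

```json
{
  "id": "fan_out",
  "type": "parallel",
  "name": "Notify All Channels",
  "branches": [
    ["send_email"],
    ["post_slack"]
  ],
  "waitFor": "all"  // or "any", or a number N of completions
}
```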

Code Node

Custom JavaScript execution

  • Sandboxed with isolated-vm
  • 128 MB memory limit
  • 10 second timeout
  • Access to workflow context
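
The code node's contract can be seen in the ETL example further down: the sandboxed script receives the previous node's output as `input` and returns a plain value. Here is that same transform written as a standalone function so its behavior is easy to check (the wrapper function and sample data are illustrative; only the `input`/return shape comes from the example):

```javascript
// Illustrative code-node body: normalize user rows from a query result.
// `input` is assumed to carry the upstream node's output, matching the
// "transform" node in the ETL example.
const codeNodeBody = (input) =>
  input.rows.map((u) => ({
    id: u.id,
    email: u.email.toLowerCase(),
    active: u.status === "active",
  }));

// Local simulation of what the sandbox would pass in:
const result = codeNodeBody({
  rows: [
    { id: 1, email: "Ada@Example.com", status: "active" },
    { id: 2, email: "Bob@Example.com", status: "disabled" },
  ],
});
```

Inside the actual sandbox the script is capped at 128 MB of memory and 10 seconds of runtime, so heavy transforms belong in an action node or subworkflow instead.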

Wait Node

Pause execution

  • Duration (ISO 8601, e.g., PT1H)
  • Until timestamp
  • Signal (wait for external event)
  • Configurable timeout
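
Two sketches of a wait node, one duration-based and one signal-based. The config keys (`wait`, `signal`, `timeout`) are illustrative assumptions; the ISO 8601 durations follow the convention documented above:

```json
{
  "id": "cooldown",
  "type": "wait",
  "wait": {
    "duration": "PT1H"  // ISO 8601: pause for one hour
  }
}

// OR

{
  "id": "await_payment",
  "type": "wait",
  "wait": {
    "signal": "payment.confirmed",  // resume on external event
    "timeout": "PT24H"              // give up after 24 hours
  }
}
```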

Human Node

Human-in-the-loop approvals

  • Task assignment to users/roles
  • Custom form input
  • Quick action buttons
  • Escalation on timeout
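
A sketch of a human approval gate. All config keys here (`assignTo`, `form`, `actions`, `timeout`, `escalateTo`) are hypothetical names chosen to illustrate the capabilities listed above:

```json
{
  "id": "approve_export",
  "type": "human",
  "name": "Approve Export",
  "human": {
    "assignTo": { "role": "admin" },       // task assignment to a role
    "form": {
      "comment": { "type": "string" }      // custom form input
    },
    "actions": ["approve", "reject"],      // quick action buttons
    "timeout": "P2D",                      // escalate after 2 days
    "escalateTo": { "role": "manager" }
  }
}
```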

Subworkflow Node

Compose workflows

  • Reusable workflow patterns
  • Input/output mapping
  • Async or blocking execution
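
A sketch of a subworkflow node invoking a reusable workflow. The `workflowId`, mapping keys, and `mode` values are illustrative assumptions:

```json
{
  "id": "enrich",
  "type": "subworkflow",
  "name": "Enrich User",
  "workflowId": "wf_enrich_user",            // the reusable workflow to call
  "input": { "userId": "{{trigger.userId}}" },  // input mapping
  "mode": "blocking"                         // or "async" (fire and forget)
}
```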

Workflow Triggers

Manual Trigger

User-initiated execution with optional input schema

{
  "type": "manual",
  "inputSchema": {
    "type": "object",
    "properties": {
      "email": { "type": "string" },
      "count": { "type": "number" }
    }
  }
}

Schedule Trigger

Time-based execution with cron or interval

{
  "type": "schedule",
  "cron": "0 9 * * 1",  // Mon at 9am
  "timezone": "Europe/London"
}

// OR

{
  "type": "schedule",
  "interval": "PT1H"  // Every hour
}

Webhook Trigger

HTTP endpoint for external integrations

{
  "type": "webhook",
  "path": "github/push",
  "method": "POST",
  "secret": "hmac_secret"
}

// Endpoint:
// POST /api/webhooks/workflows/{id}/github/push

Event Trigger

Internal event-driven execution

{
  "type": "event",
  "eventType": "file.uploaded",
  "source": "space_123"
}

Template Variables

Access data from previous nodes, trigger input, and workflow variables using {{expression}} syntax.

// Access node outputs
{{node_1.output.userId}}
{{http_fetch.output.body.results[0].id}}

// Access trigger data
{{trigger.email}}
{{trigger.payload.repository.full_name}}

// Access workflow variables
{{variables.apiKey}}
{{variables.config.baseUrl}}

// Operators
{{node_1.output.count > 100}}
{{node_2.output.status == "success"}}

// Array operations
{{node_list.output.items.length}}
{{node_list.output.items[0].name}}

// Example in HTTP action:
{
  "service": "http",
  "operation": "request",
  "config": {
    "url": "{{variables.apiUrl}}/users/{{node_1.output.userId}}",
    "method": "GET",
    "headers": {
      "Authorization": "Bearer {{variables.apiKey}}"
    }
  }
}

Example: Data ETL Pipeline

{
  "name": "Daily User Export",
  "description": "Extract users from Postgres, transform, and load to S3",
  "trigger": {
    "type": "schedule",
    "cron": "0 2 * * *",  // 2am daily
    "timezone": "UTC"
  },
  "nodes": [
    {
      "id": "extract",
      "type": "action",
      "name": "Extract Users",
      "action": {
        "service": "postgres",
        "operation": "query",
        "config": {
          "query": "SELECT * FROM users WHERE created_at >= NOW() - INTERVAL '1 day'"
        }
      }
    },
    {
      "id": "transform",
      "type": "code",
      "name": "Transform Data",
      "language": "javascript",
      "sandbox": true,
      "code": "return input.rows.map(u => ({ id: u.id, email: u.email.toLowerCase(), active: u.status === 'active' }));"
    },
    {
      "id": "check_count",
      "type": "condition",
      "name": "Check if any users",
      "condition": {
        "left": "{{transform.output.length}}",
        "operator": "gt",
        "right": 0
      },
      "trueBranch": "upload",
      "falseBranch": "notify_empty"
    },
    {
      "id": "upload",
      "type": "action",
      "name": "Upload to S3",
      "action": {
        "service": "minio",
        "operation": "upload",
        "config": {
          "bucket": "exports",
          "key": "users/{{trigger.timestamp}}.json",
          "content": "{{JSON.stringify(transform.output)}}"
        }
      }
    },
    {
      "id": "notify_empty",
      "type": "action",
      "name": "Notify Empty",
      "action": {
        "service": "email",
        "operation": "send",
        "config": {
          "to": "admin@example.com",
          "subject": "No users to export",
          "body": "Daily export found 0 new users"
        }
      }
    }
  ],
  "edges": [
    { "id": "e1", "from": "extract", "to": "transform" },
    { "id": "e2", "from": "transform", "to": "check_count" }
  ]
}

Temporal Orchestration

Workflows are executed by Temporal, providing:

  • Durability - Workflows survive crashes and restarts
  • Retries - Automatic retry with exponential backoff
  • Signals - External events can trigger workflow actions
  • Versioning - Update workflows without breaking running executions
  • Observability - Full execution history and tracing

Namespace isolation ensures workflows in different spaces cannot interfere with each other.

Available Tools

list_processes

List all workflow definitions

create_process

Create new workflow from JSON

execute_process

Trigger workflow execution

get_execution

Get execution status and outputs

cancel_execution

Cancel running workflow