Batch Data

Controlled node

Overview

The Batch Data node retrieves paginated data from a project database table when running workflows in batch mode. It is designed to work with Intellectible's batch processing system, allowing workflows to process large datasets in manageable chunks.

When testing in the workflow editor, the node provides a UI to simulate batch data by selecting a database, table, and configuring test pagination parameters. In production runtime, it automatically uses the batch configuration defined at the workflow level.

Batch Processing Context

This node is typically used within workflows configured for batch execution, where the workflow runs multiple times—once for each batch of data. The node automatically handles pagination using the batch index and size defined in the workflow's batch options.

Inputs

| Input | Type | Description | Default |
| --- | --- | --- | --- |
| Run | Event | Triggers the node to fetch the current batch of data. | - |

Outputs

| Output | Type | Description |
| --- | --- | --- |
| Done | Event | Fires when the database query has completed successfully. |
| Data | Data | An array of records from the database table for the current batch. Returns an empty array if no data is found or if batch options are not configured. |

Runtime Behavior and Defaults

Editor Mode (Testing)

When running in the workflow editor, the node uses test configuration parameters set in the node's properties panel:

  • Test Database: Select a project database to query
  • Test Table: Select a specific table from the chosen database
  • Batch Index (Limit): The number of records to retrieve per batch (default: 10)
  • Batch Size (Offset): The starting offset for the current batch (default: 0)

The node constructs a SQL query using these parameters:

```sql
SELECT * FROM {table} ORDER BY intellectible_id ASC LIMIT {testBatchIndex} OFFSET {testBatchSize}
```
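As a sketch of how the editor-mode query might be assembled from those panel values (the helper function name is illustrative, not part of Intellectible's API):

```python
# Illustrative sketch of the editor-mode query construction.
# build_test_query is a hypothetical helper, not part of Intellectible's API.
def build_test_query(table: str, test_batch_index: int = 10, test_batch_size: int = 0) -> str:
    """Build the paginated SELECT used when testing in the editor.

    Note the naming quirk: the "Batch Index (Limit)" field feeds LIMIT,
    and the "Batch Size (Offset)" field feeds OFFSET.
    """
    return (
        f"SELECT * FROM {table} "
        f"ORDER BY intellectible_id ASC "
        f"LIMIT {test_batch_index} OFFSET {test_batch_size}"
    )

print(build_test_query("articles"))
# SELECT * FROM articles ORDER BY intellectible_id ASC LIMIT 10 OFFSET 0
```

With the defaults (limit 10, offset 0), a test run fetches only the first ten records of the table.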

Production Runtime

When the workflow runs in batch mode (outside the editor), the node reads batch configuration from the workflow's `batchOpts`:

  • batchIndex: The current batch iteration number
  • batchSize: The number of records per batch
  • batchInputDatabase: The database ID to query
  • batchInputTable: The table name to query

The offset is calculated as `batchIndex * batchSize`, and the query follows the same pattern as in editor mode.
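For example, with a batch size of 100 records, successive batch iterations read from the following offsets (a small sketch of the documented calculation):

```python
# Production offset calculation: offset = batchIndex * batchSize.
batch_size = 100  # records per batch, from the workflow's batchOpts

for batch_index in range(3):
    offset = batch_index * batch_size
    print(f"batch {batch_index}: LIMIT {batch_size} OFFSET {offset}")
# batch 0: LIMIT 100 OFFSET 0
# batch 1: LIMIT 100 OFFSET 100
# batch 2: LIMIT 100 OFFSET 200
```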

Error Handling

If the project ID cannot be determined, or if batch options are not properly configured in production, the node returns an empty array for the data output.
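A minimal sketch of this fallback behavior, assuming the documented `batchOpts` keys (the function itself and the `run_query` callback are hypothetical, shown only to illustrate the empty-array contract):

```python
# Hypothetical sketch of the node's fallback: return [] instead of raising
# when the project ID or batch options are missing or incomplete.
REQUIRED_OPTS = ("batchIndex", "batchSize", "batchInputDatabase", "batchInputTable")

def fetch_batch(project_id, batch_opts, run_query=None):
    """Return the current batch of records, or an empty array on misconfiguration."""
    if project_id is None:
        return []  # project ID could not be determined
    if not batch_opts or any(key not in batch_opts for key in REQUIRED_OPTS):
        return []  # batch options not properly configured
    # run_query stands in for the actual paginated database call.
    return run_query(batch_opts) if run_query else []
```

Downstream nodes therefore receive an empty Data array rather than a runtime error, so they should handle the zero-record case.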

Example Usage

Basic Batch Processing Setup

  1. Configure your workflow to use batch input (set at the workflow level)
  2. Add a Batch Data node to retrieve the data for each iteration
  3. Connect the Data output to downstream nodes (e.g., AI Write, Database operations)
  4. Use the Done event to trigger processing of the retrieved batch

Testing Configuration

When building your workflow:

  1. Select a Test Database and Test Table in the node's panel UI
  2. Set Batch Index (Limit) to control how many records to fetch per test run (e.g., 10)
  3. Set Batch Size (Offset) to control the starting position (e.g., 0 for the first batch)
  4. Run the workflow to test with a subset of your data before deploying

Workflow Pattern

Start → Batch Data → [Process Records] → [Next Batch Logic] → Done

The Batch Data node works in conjunction with other batch utility nodes like Batch Index, Batch Length, and Batch Size to create complete batch processing workflows.
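The pagination pattern described above can be sketched end to end with SQLite as an illustrative stand-in for a project database (table and column names follow the query shown earlier; the loop plays the role of the workflow's batch iterations):

```python
import sqlite3

# In-memory stand-in for a project database table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE articles (intellectible_id INTEGER PRIMARY KEY, title TEXT)")
conn.executemany(
    "INSERT INTO articles (title) VALUES (?)",
    [(f"record {i}",) for i in range(25)],
)

batch_size = 10
batch_index = 0
while True:
    # Same shape as the node's query: offset = batchIndex * batchSize.
    rows = conn.execute(
        "SELECT * FROM articles ORDER BY intellectible_id ASC LIMIT ? OFFSET ?",
        (batch_size, batch_index * batch_size),
    ).fetchall()
    if not rows:  # an empty batch marks the end of the data
        break
    print(f"batch {batch_index}: {len(rows)} rows")
    batch_index += 1
# batch 0: 10 rows
# batch 1: 10 rows
# batch 2: 5 rows
```

Ordering by a stable key (`intellectible_id`) is what keeps each batch disjoint; without a deterministic ORDER BY, LIMIT/OFFSET pagination can skip or repeat records.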