DatabaseClient SDK
The DatabaseClient SDK provides secure access to your project's PostgreSQL databases from within code nodes. Create tables, insert data, query records, and manage your database schema programmatically.
Overview
The Database Client is automatically available in all code nodes and requires no additional configuration. It uses secure execution tokens that are scoped to your project.
Key Features:
- List databases and tables
- Create and manage tables
- Define columns with types and constraints
- Insert, update, and delete rows
- Query data with SQL
- Bulk CSV imports
Installation
The SDK is pre-installed in all code execution environments. No installation required.
Code Structure
Code nodes run as standalone scripts, so you need to structure your code correctly:
JavaScript — await cannot be used at the top level. Wrap your code in an async function:
const { DatabaseClient } = require('@intellectible/execution-sdk');

async function main() {
  const database = new DatabaseClient();
  const databases = await database.listDatabases();
  // ... your logic here
}

main().catch(err => console.error(err));
Python — SDK methods are synchronous (no async/await needed). Use a main() function with __main__ guard:
from intellectible_execution import DatabaseClient
def main():
    database = DatabaseClient()
    databases = database.list_databases()
    # ... your logic here

if __name__ == '__main__':
    main()
All method examples below show just the SDK calls for brevity. In your code node, wrap them inside main() as shown above.
Quick Reference
JavaScript
const { DatabaseClient } = require('@intellectible/execution-sdk');
const database = new DatabaseClient();
| Method | Returns | Description |
|---|---|---|
| listDatabases() | {id, name, createdDate}[] | List all databases |
| listTables(databaseId) | string[] | List tables in a database |
| createTable(databaseId, tableId) | string | Create a table |
| deleteTable(databaseId, tableId) | void | Delete a table |
| createColumn(databaseId, tableId, colId, colType, options?) | string | Add a column |
| deleteColumn(databaseId, tableId, colId) | void | Delete a column |
| query(databaseId, tableId, sqlQuery) | Object[] | Run SQL query, returns matching rows |
| insertRows(databaseId, tableId, rows) | string[] | Insert rows, returns row IDs |
| updateCell(databaseId, tableId, rowId, colId, newValue) | void | Update a cell |
| deleteRows(databaseId, tableId, rowIds) | void | Delete rows |
| uploadCSV(databaseId, tableId, docId) | object | Import CSV from library |
Python
from intellectible_execution import DatabaseClient
database = DatabaseClient()
| Method | Returns | Description |
|---|---|---|
| list_databases() | List[Dict] | List all databases |
| list_tables(database_id) | List[str] | List tables in a database |
| create_table(database_id, table_id) | str | Create a table |
| delete_table(database_id, table_id) | None | Delete a table |
| create_column(database_id, table_id, col_id, col_type, ...) | str | Add a column |
| delete_column(database_id, table_id, col_id) | None | Delete a column |
| query(database_id, table_id, sql_query) | List[Dict] | Run SQL query, returns matching rows |
| insert_rows(database_id, table_id, rows) | List[str] | Insert rows, returns row IDs |
| update_cell(database_id, table_id, row_id, col_id, new_value) | None | Update a cell |
| delete_rows(database_id, table_id, row_ids) | None | Delete rows |
| upload_csv(database_id, table_id, doc_id) | Dict | Import CSV from library |
JavaScript SDK
Import
const { DatabaseClient } = require('@intellectible/execution-sdk');
const database = new DatabaseClient();
Database Operations
listDatabases()
List all databases in the project.
Returns: Promise<Array<{id: string, name: string, createdDate: string}>>
Example:
const databases = await database.listDatabases();
console.log(`Found ${databases.length} databases`);
databases.forEach(db => {
  console.log(`- ${db.name} (ID: ${db.id})`);
});
Table Operations
listTables(databaseId)
List all tables in a database.
Parameters:
- databaseId (string, required): Database ID
Returns: Promise<Array<string>> - Array of table names
Example:
const tables = await database.listTables(dbId);
console.log(`Found ${tables.length} tables`);
tables.forEach(table => {
  console.log(`- ${table}`);
});
createTable(databaseId, tableId)
Create a new table.
Parameters:
- databaseId (string, required): Database ID
- tableId (string, required): Table name/ID
Returns: Promise<string> - Table ID
Example:
const tableId = await database.createTable(dbId, 'users');
console.log('Table created:', tableId);
Notes:
- Table starts with no columns
- Add columns using createColumn()
deleteTable(databaseId, tableId)
Delete a table and all its data.
Parameters:
- databaseId (string, required): Database ID
- tableId (string, required): Table ID to delete
Returns: Promise<void>
Example:
await database.deleteTable(dbId, 'old_users');
console.log('Table deleted');
Warning: This permanently deletes all rows.
Column Operations
createColumn(databaseId, tableId, colId, colType, options)
Add a column to a table.
Parameters:
- databaseId (string, required): Database ID
- tableId (string, required): Table ID
- colId (string, required): Column name
- colType (string, required): Column type (see types below)
- options (object, optional):
  - colTypeParam (any): Type-specific parameter (e.g., VARCHAR length)
  - colIsUnique (boolean): Enforce uniqueness constraint
  - colIsIndexed (boolean): Create index for faster queries
Column Types:
- text - Text/string data
- number - Numeric data with arbitrary precision
- boolean - True/false values
- vector - pgvector embeddings (use colTypeParam for dimension, 1-2000, default 1024)
- json - JSON data
- jsonb - Binary JSON data (faster queries, recommended)
- tsvector - Full-text search vectors
Returns: Promise<string> - Column ID
Examples:
// Text column
await database.createColumn(dbId, 'users', 'name', 'text');
// Email with unique constraint
await database.createColumn(dbId, 'users', 'email', 'text', {
  colIsUnique: true,
  colIsIndexed: true
});
// Numeric age
await database.createColumn(dbId, 'users', 'age', 'number');
// Boolean active flag
await database.createColumn(dbId, 'users', 'is_active', 'boolean');
// JSON metadata (use jsonb for better performance)
await database.createColumn(dbId, 'users', 'metadata', 'jsonb');
// Vector embeddings (1536 dimensions for OpenAI embeddings)
await database.createColumn(dbId, 'documents', 'embedding', 'vector', {
  colTypeParam: 1536
});
deleteColumn(databaseId, tableId, colId)
Remove a column from a table.
Parameters:
- databaseId (string, required): Database ID
- tableId (string, required): Table ID
- colId (string, required): Column ID to delete
Returns: Promise<void>
Example:
await database.deleteColumn(dbId, 'users', 'old_field');
console.log('Column deleted');
Warning: This permanently deletes all data in the column.
Row Operations
query(databaseId, tableId, sqlQuery)
Run a SQL query against a table. Returns matching rows.
Parameters:
- databaseId (string, required): Database ID
- tableId (string, required): Table ID
- sqlQuery (string, required): SQL query to execute
Returns: Promise<Object[]> - Array of row objects
Example:
// Query all rows (with limit)
const rows = await database.query(dbId, 'users', 'SELECT * FROM "users" LIMIT 20');
console.log(`Found ${rows.length} rows`);
rows.forEach(row => {
  console.log(`User: ${row.name}, Email: ${row.email}`);
});
// Filter with WHERE clause
const activeUsers = await database.query(dbId, 'users',
  'SELECT * FROM "users" WHERE "is_active" = true'
);
// Aggregate queries
const stats = await database.query(dbId, 'users',
  'SELECT COUNT(*) as total, AVG("age") as avg_age FROM "users"'
);
Notes:
- Always wrap table and column names in double quotes ("name")
- Use single quotes for string values ('value')
- Standard PostgreSQL SQL syntax is supported
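The quoting rules above can be wrapped in small helpers when building query strings dynamically. The following is an illustrative sketch (the helper names are ours, not part of the SDK) that follows standard PostgreSQL escaping: doubled double-quotes inside identifiers, doubled single-quotes inside string literals.

```python
def quote_ident(name):
    """Wrap a table or column name in double quotes, escaping embedded quotes."""
    return '"' + name.replace('"', '""') + '"'

def quote_literal(value):
    """Wrap a string value in single quotes, escaping embedded quotes."""
    return "'" + str(value).replace("'", "''") + "'"

# Build a query string with properly quoted identifiers and literal
sql = 'SELECT * FROM {} WHERE {} = {}'.format(
    quote_ident('users'), quote_ident('email'), quote_literal('alice@example.com')
)
# sql == 'SELECT * FROM "users" WHERE "email" = \'alice@example.com\''
```

Pass the resulting string as the sql_query argument to query().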
insertRows(databaseId, tableId, rows)
Insert one or more rows into a table.
Parameters:
- databaseId (string, required): Database ID
- tableId (string, required): Table ID
- rows (Object[], required): Array of row objects with column values as key-value pairs
Returns: Promise<string[]> - Array of row IDs
Example:
// Insert a single row
const [rowId] = await database.insertRows(dbId, 'users', [{
  name: 'Alice',
  email: 'alice@example.com',
  age: 30,
  created_at: new Date().toISOString()
}]);
console.log('Row inserted:', rowId);
// Insert multiple rows at once
const rowIds = await database.insertRows(dbId, 'users', [
  { name: 'Bob', email: 'bob@example.com', age: 25 },
  { name: 'Charlie', email: 'charlie@example.com', age: 35 }
]);
console.log('Rows inserted:', rowIds);
Notes:
- Provide values for all non-nullable columns
- JSON values must be stringified
- Timestamps should be ISO 8601 format
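The three notes above translate into a small preparation step before inserting. A sketch in Python (the prepare_row helper is ours, not an SDK function; the column names are the ones used in the examples above):

```python
import json
from datetime import datetime, timezone

def prepare_row(name, email, age, metadata):
    """Build a row dict that satisfies the insert rules:
    JSON values stringified, timestamps in ISO 8601."""
    return {
        'name': name,
        'email': email,
        'age': age,
        'metadata': json.dumps(metadata),  # JSON values must be stringified
        'created_at': datetime.now(timezone.utc).isoformat(),  # ISO 8601
    }

row = prepare_row('Alice', 'alice@example.com', 30, {'source': 'manual'})
# row['metadata'] is the string '{"source": "manual"}'
```

The resulting dict can be passed directly inside the list given to insert_rows().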
updateCell(databaseId, tableId, rowId, colId, newValue)
Update a single cell value.
Parameters:
- databaseId (string, required): Database ID
- tableId (string, required): Table ID
- rowId (string, required): Row ID to update
- colId (string, required): Column ID to update
- newValue (any, required): New value for the cell
Returns: Promise<void>
Example:
// Update user's age
await database.updateCell(dbId, 'users', rowId, 'age', 31);
// Update JSON metadata
const metadata = { source: 'manual', verified: true };
await database.updateCell(dbId, 'users', rowId, 'metadata', JSON.stringify(metadata));
console.log('Cell updated');
deleteRows(databaseId, tableId, rowIds)
Delete multiple rows.
Parameters:
- databaseId (string, required): Database ID
- tableId (string, required): Table ID
- rowIds (Array<string>, required): Array of row IDs to delete
Returns: Promise<void>
Example:
// Delete specific rows
await database.deleteRows(dbId, 'users', ['row_1', 'row_3', 'row_5']);
// Query and delete old records
const rows = await database.query(dbId, 'logs',
  'SELECT * FROM "logs" WHERE "created_at" < \'2025-01-01\''
);
const oldRowIds = rows.map(row => row.rowid);
await database.deleteRows(dbId, 'logs', oldRowIds);
console.log(`Deleted ${oldRowIds.length} old rows`);
uploadCSV(databaseId, tableId, docId)
Bulk import CSV data from a library file.
Parameters:
- databaseId (string, required): Database ID
- tableId (string, required): Table ID
- docId (string, required): Library document ID containing CSV data
Returns: Promise<Object> - Import result
Example:
const { LibraryClient, DatabaseClient } = require('@intellectible/execution-sdk');
const library = new LibraryClient();
const database = new DatabaseClient();
// Find CSV file in library
const files = await library.listFiles();
const csvFile = files.find(f => f.name === 'import.csv');
// Create table with matching columns
const dbId = 'db_analytics';
await database.createTable(dbId, 'imports');
await database.createColumn(dbId, 'imports', 'name', 'text');
await database.createColumn(dbId, 'imports', 'value', 'number');
// Upload CSV data
const result = await database.uploadCSV(dbId, 'imports', csvFile.id);
console.log('Import result:', result);
CSV Requirements:
- First row must be column headers
- Headers must match existing table columns
- Values must match column types
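Since headers must match existing table columns, it can help to validate the file before calling uploadCSV. A minimal sketch using Python's standard csv module (the validation helper is ours, not an SDK feature):

```python
import csv
import io

def validate_csv_header(csv_text, expected_columns):
    """Return (ok, missing), where missing lists the expected
    columns absent from the CSV's first (header) row."""
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader, [])
    missing = [c for c in expected_columns if c not in header]
    return (len(missing) == 0, missing)

# Matches the 'imports' table created in the example above
ok, missing = validate_csv_header('name,value\nwidget,42\n', ['name', 'value'])
# ok is True, missing is []
```

Running this check before the upload turns a failed import into an actionable error message listing the missing columns.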
Python SDK
Import
from intellectible_execution import DatabaseClient
database = DatabaseClient()
Database Operations
list_databases()
List all databases in the project.
Returns: List[Dict]
Example:
databases = database.list_databases()
print(f'Found {len(databases)} databases')
for db in databases:
    print(f'- {db["name"]} (ID: {db["id"]})')
Table Operations
list_tables(database_id)
List all tables in a database.
Parameters:
- database_id (str, required): Database ID
Returns: List[str]
Example:
tables = database.list_tables(db_id)
print(f'Found {len(tables)} tables')
for table in tables:
    print(f'- {table}')
create_table(database_id, table_id)
Create a new table.
Parameters:
- database_id (str, required): Database ID
- table_id (str, required): Table name
Returns: str - Table ID
Example:
table_id = database.create_table(db_id, 'users')
print('Table created:', table_id)
delete_table(database_id, table_id)
Delete a table.
Parameters:
- database_id (str, required): Database ID
- table_id (str, required): Table ID
Returns: None
Column Operations
create_column(database_id, table_id, col_id, col_type, ...)
Add a column to a table.
Parameters:
- database_id (str, required): Database ID
- table_id (str, required): Table ID
- col_id (str, required): Column name
- col_type (str, required): Column type
- col_type_param (Any, optional): Type parameter
- col_is_unique (bool, optional): Unique constraint
- col_is_indexed (bool, optional): Create index
Returns: str - Column ID
Example:
# Text column
database.create_column(db_id, 'users', 'name', 'text')
# Email with unique constraint
database.create_column(
    db_id, 'users', 'email', 'text',
    col_is_unique=True,
    col_is_indexed=True
)
# Numeric age
database.create_column(db_id, 'users', 'age', 'number')
# Boolean active flag
database.create_column(db_id, 'users', 'is_active', 'boolean')
# JSON metadata (use jsonb for better performance)
database.create_column(db_id, 'users', 'metadata', 'jsonb')
# Vector embeddings (1536 dimensions for OpenAI embeddings)
database.create_column(
    db_id, 'documents', 'embedding', 'vector',
    col_type_param=1536
)
delete_column(database_id, table_id, col_id)
Remove a column from a table.
Parameters:
- database_id (str, required): Database ID
- table_id (str, required): Table ID
- col_id (str, required): Column ID
Returns: None
Row Operations
query(database_id, table_id, sql_query)
Run a SQL query against a table. Returns matching rows.
Parameters:
- database_id (str, required): Database ID
- table_id (str, required): Table ID
- sql_query (str, required): SQL query to execute
Returns: List[Dict] - List of row dictionaries
Example:
# Query all rows (with limit)
rows = database.query(db_id, 'users', 'SELECT * FROM "users" LIMIT 20')
print(f'Found {len(rows)} rows')
for row in rows:
    print(f'User: {row["name"]}, Email: {row["email"]}')
# Filter with WHERE clause
active_users = database.query(db_id, 'users',
    'SELECT * FROM "users" WHERE "is_active" = true'
)
insert_rows(database_id, table_id, rows)
Insert one or more rows into a table.
Parameters:
- database_id (str, required): Database ID
- table_id (str, required): Table ID
- rows (List[Dict], required): List of row dictionaries with column values
Returns: List[str] - List of row IDs
Example:
from datetime import datetime
# Insert a single row
row_ids = database.insert_rows(db_id, 'users', [{
    'name': 'Alice',
    'email': 'alice@example.com',
    'age': 30,
    'created_at': datetime.now().isoformat()
}])
print('Row inserted:', row_ids[0])
# Insert multiple rows at once
row_ids = database.insert_rows(db_id, 'users', [
    {'name': 'Bob', 'email': 'bob@example.com', 'age': 25},
    {'name': 'Charlie', 'email': 'charlie@example.com', 'age': 35}
])
print('Rows inserted:', row_ids)
update_cell(database_id, table_id, row_id, col_id, new_value)
Update a single cell value.
Parameters:
- database_id (str, required): Database ID
- table_id (str, required): Table ID
- row_id (str, required): Row ID
- col_id (str, required): Column ID
- new_value (Any, required): New value
Returns: None
Example:
import json
# Update user's age
database.update_cell(db_id, 'users', row_id, 'age', 31)
# Update JSON metadata
metadata = {'source': 'manual', 'verified': True}
database.update_cell(db_id, 'users', row_id, 'metadata', json.dumps(metadata))
print('Cell updated')
delete_rows(database_id, table_id, row_ids)
Delete multiple rows.
Parameters:
- database_id (str, required): Database ID
- table_id (str, required): Table ID
- row_ids (List[str], required): Row IDs to delete
Returns: None
Example:
# Delete specific rows
database.delete_rows(db_id, 'users', ['row_1', 'row_3', 'row_5'])
# Query and delete old records
rows = database.query(db_id, 'logs',
    'SELECT * FROM "logs" WHERE "created_at" < \'2025-01-01\''
)
old_row_ids = [row['rowid'] for row in rows]
database.delete_rows(db_id, 'logs', old_row_ids)
print(f'Deleted {len(old_row_ids)} old rows')
upload_csv(database_id, table_id, doc_id)
Bulk import CSV data from a library file.
Parameters:
- database_id (str, required): Database ID
- table_id (str, required): Table ID
- doc_id (str, required): Library document ID
Returns: Dict - Import result
Example:
from intellectible_execution import LibraryClient, DatabaseClient
library = LibraryClient()
database = DatabaseClient()
# Find CSV file
files = library.list_files()
csv_file = next(f for f in files if f['name'] == 'import.csv')
# Create table
db_id = 'db_analytics'
database.create_table(db_id, 'imports')
database.create_column(db_id, 'imports', 'name', 'text')
database.create_column(db_id, 'imports', 'value', 'number')
# Upload CSV
result = database.upload_csv(db_id, 'imports', csv_file['id'])
print('Import result:', result)
Complete Example
Build Analytics Workflow
JavaScript:
const { DatabaseClient } = require('@intellectible/execution-sdk');
const fs = require('fs');
async function main() {
  const database = new DatabaseClient();
  // Read inputs
  const inputs = JSON.parse(fs.readFileSync('inputs.json', 'utf8'));
  // Use an existing database
  const databases = await database.listDatabases();
  const dbId = databases[0].id;
  // Create execution log table
  await database.createTable(dbId, 'executions');
  await database.createColumn(dbId, 'executions', 'workflow_name', 'text');
  await database.createColumn(dbId, 'executions', 'status', 'text');
  await database.createColumn(dbId, 'executions', 'duration_ms', 'number');
  await database.createColumn(dbId, 'executions', 'executed_at', 'text');
  // Log current execution
  await database.insertRows(dbId, 'executions', [{
    workflow_name: inputs.workflowName,
    status: 'success',
    duration_ms: inputs.duration,
    executed_at: new Date().toISOString()
  }]);
  // Query recent executions
  const rows = await database.query(dbId, 'executions',
    'SELECT * FROM "executions" ORDER BY "executed_at" DESC LIMIT 10'
  );
  const output = {
    databaseId: dbId,
    recentExecutions: rows.length,
    averageDuration: rows.reduce((sum, r) => sum + r.duration_ms, 0) / rows.length
  };
  // Write output
  fs.writeFileSync('output.json', JSON.stringify(output, null, 2));
}

main().catch(err => console.error(err));
Python:
from intellectible_execution import DatabaseClient
from datetime import datetime
import json
def main():
    database = DatabaseClient()
    # Read inputs
    with open('inputs.json', 'r') as f:
        inputs = json.load(f)
    # Use an existing database
    databases = database.list_databases()
    db_id = databases[0]['id']
    # Create execution log table
    database.create_table(db_id, 'executions')
    database.create_column(db_id, 'executions', 'workflow_name', 'text')
    database.create_column(db_id, 'executions', 'status', 'text')
    database.create_column(db_id, 'executions', 'duration_ms', 'number')
    database.create_column(db_id, 'executions', 'executed_at', 'text')
    # Log current execution
    database.insert_rows(db_id, 'executions', [{
        'workflow_name': inputs['workflowName'],
        'status': 'success',
        'duration_ms': inputs['duration'],
        'executed_at': datetime.now().isoformat()
    }])
    # Query recent executions
    rows = database.query(db_id, 'executions',
        'SELECT * FROM "executions" ORDER BY "executed_at" DESC LIMIT 10'
    )
    avg_duration = sum(r['duration_ms'] for r in rows) / len(rows)
    output = {
        'database_id': db_id,
        'recent_executions': len(rows),
        'average_duration': avg_duration
    }
    # Write output
    with open('output.json', 'w') as f:
        json.dump(output, f)

if __name__ == '__main__':
    main()
Error Handling
Failed operations reject the returned promise in JavaScript and raise an exception in Python:
JavaScript:
try {
  const result = await database.query(dbId, 'nonexistent_table',
    'SELECT * FROM "nonexistent_table" LIMIT 1');
} catch (error) {
  console.error('Query failed:', error.message);
  // Handle error appropriately
}
Python:
try:
    result = database.query(db_id, 'nonexistent_table',
        'SELECT * FROM "nonexistent_table" LIMIT 1')
except Exception as error:
    print(f'Query failed: {error}')
    # Handle error appropriately
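For transient failures, a simple retry wrapper can complement the try/except pattern above. This is a generic sketch (not an SDK feature; the function and parameter names are ours), shown here with a stand-in flaky operation:

```python
import time

def with_retries(fn, attempts=3, delay=0.5):
    """Call fn(); on exception, retry up to `attempts` times total,
    sleeping `delay` seconds between tries, then re-raise the last error."""
    last_error = None
    for attempt in range(attempts):
        try:
            return fn()
        except Exception as error:
            last_error = error
            if attempt < attempts - 1:
                time.sleep(delay)
    raise last_error

# Stand-in for a database call that succeeds on the second try
calls = {'n': 0}
def flaky():
    calls['n'] += 1
    if calls['n'] < 2:
        raise RuntimeError('transient failure')
    return 'ok'

result = with_retries(flaky, attempts=3, delay=0)
# result == 'ok' after one retry
```

In a code node, fn would be a closure over a real SDK call, e.g. lambda: database.query(db_id, 'users', sql).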
Best Practices
- Pin column types - Use explicit types to prevent data issues
- Add indexes - Set colIsIndexed: true for frequently queried columns
- Unique constraints - Use colIsUnique: true for fields like email and username
- Use LIMIT - Always add a LIMIT clause to queries on large tables
- Batch operations - Use insertRows with multiple rows, or uploadCSV for bulk imports
- Error handling - Wrap database operations in try-catch blocks
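The batching advice can be sketched as a small chunking helper: split a large row list into fixed-size batches and pass each batch to insertRows / insert_rows. The batch size here is an assumption for illustration; the SDK does not document a hard limit.

```python
def chunk_rows(rows, batch_size=500):
    """Yield successive batches of at most batch_size rows."""
    for start in range(0, len(rows), batch_size):
        yield rows[start:start + batch_size]

rows = [{'name': f'user_{i}'} for i in range(1200)]
batches = list(chunk_rows(rows, batch_size=500))
# 3 batches: 500 + 500 + 200 rows
```

In a code node each batch would then go through one insert call, e.g. database.insert_rows(db_id, 'users', batch) inside a loop.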
Limitations
- Execution timeout: All operations must complete within the 10-minute code node timeout
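Long-running loops can track their own elapsed time so they stop cleanly before the node is killed. A minimal sketch (the class and the headroom value are ours, not SDK features):

```python
import time

class TimeBudget:
    """Track elapsed wall-clock time against a limit so long loops
    can exit cleanly before the code-node timeout."""
    def __init__(self, limit_seconds=9 * 60):  # leave headroom under the 10-minute cap
        self.start = time.monotonic()
        self.limit = limit_seconds

    def exceeded(self):
        return time.monotonic() - self.start > self.limit

# Demonstration with a tiny budget so the effect is visible immediately
budget = TimeBudget(limit_seconds=0.01)
time.sleep(0.02)
stopped_early = budget.exceeded()  # True: budget was exhausted
```

A processing loop would check budget.exceeded() each iteration and break out, persisting partial progress, instead of being terminated mid-write.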
See Also
- LibraryClient SDK - Access project files
- Databases Guide - Conceptual overview
- Code Node Data Access Guide - How it works