In modern software development, the "as-code" paradigm has become the gold standard for managing complex systems. We have Infrastructure-as-Code (IaC), Configuration-as-Code, and Policy-as-Code. This approach brings predictability, versioning, and automation to critical components. So, why should your business-critical workflows be any different?
Treating your automated processes as code is the next logical step for building robust, scalable, and collaborative systems. The .do Agentic Workflow Platform is designed from the ground up to support this methodology. By combining the power of our platform with a solid CI/CD (Continuous Integration/Continuous Deployment) pipeline, you can bring the same rigor and efficiency to your business logic as you do to your application code.
This guide will walk you through the best practices for managing your .do workflows as code, enabling you to build a powerful, automated pipeline for testing and deployment.
Before diving into the "how," let's understand the "why." Shifting from managing workflows through a UI to managing them in a Git repository offers transformative benefits:

- **Version control and history:** Every change to a workflow is a commit, so you always know what changed, when, and by whom, and you can revert to a known-good version.
- **Collaboration and review:** Changes go through pull requests, bringing code review to your business logic.
- **Automated testing:** Proposed changes can be validated and tested before they ever reach production.
- **Repeatable deployments:** Deployment becomes a scripted, automated step rather than a manual, error-prone one.
Let's get practical. Here’s how you can structure a CI/CD pipeline, using GitHub Actions as an example, to automate the lifecycle of your .do workflows.
A well-organized repository is the foundation of a good workflow-as-code strategy. We recommend a structure that separates workflow definitions, test files, and deployment scripts.
```
/my-do-workflows
├── .github/
│   └── workflows/
│       └── main.yml              # Your CI/CD pipeline definition
├── workflows/
│   ├── customer-onboarding.json
│   └── monthly-reporting.json
├── tests/
│   └── test_onboarding.py
└── scripts/
    └── deploy.js
```
Your workflow definition files (e.g., customer-onboarding.json) would contain the declarative structure of your workflow that the .do platform executes.
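As an illustration, a definition file might look something like the sketch below. The exact schema is defined by the .do platform; the `trigger`, `steps`, and `agent` fields here are hypothetical, but the top-level `name` and `definition` keys match what the deployment script later in this guide reads.

```json
{
  "name": "customer-onboarding-workflow",
  "definition": {
    "trigger": { "type": "api" },
    "steps": [
      { "id": "create-account", "agent": "account-provisioner" },
      {
        "id": "send-welcome-email",
        "agent": "email-sender",
        "dependsOn": ["create-account"]
      }
    ]
  }
}
```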
The CI stage runs on every push to your repository, especially on pull requests. Its goal is to ensure that any proposed changes are valid and don't break existing functionality.
A. Linting & Validation: The first check is simple: is the JSON or YAML valid? This basic step catches syntax errors instantly.
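For JSON definitions, this check needs nothing beyond the Python standard library. Here's a minimal sketch that parses every definition file in a directory (assuming the repository layout shown earlier) and reports any that fail:

```python
import json
from pathlib import Path


def validate_workflow_files(directory: str) -> list:
    """Parse every .json file in `directory`; return error messages for any that fail."""
    errors = []
    for path in sorted(Path(directory).glob("*.json")):
        try:
            json.loads(path.read_text(encoding="utf-8"))
        except json.JSONDecodeError as exc:
            errors.append(f"{path.name}: {exc}")
    return errors
```

In a CI step, you would run this against the `workflows/` directory and fail the build if the returned list is non-empty. A YAML equivalent would be just as short using a YAML parser such as PyYAML.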
B. Integration Testing: This is where the .do SDK shines. You can write scripts that use the SDK to run a workflow in a dedicated testing environment and assert that the output is correct.
Here’s an example using the Python SDK to test our customer-onboarding-workflow.
tests/test_onboarding.py

```python
import os
import time

from do_sdk import Do  # Fictional SDK import for clarity

# Initialize the client with a test-specific API key
do_client = Do(api_key=os.getenv("DO_API_KEY_TESTING"))


def test_customer_onboarding_success():
    """Tests the happy path for the customer onboarding workflow."""
    mock_customer_data = {
        "name": "Jane Doe",
        "email": "jane.doe@example.com",
        "plan": "premium",
    }

    # Run the workflow using the definition from our repo
    # (assuming a helper function 'load_workflow_def' exists)
    # workflow_def = load_workflow_def('workflows/customer-onboarding.json')
    result = do_client.workflows.run(
        name="customer-onboarding-workflow",
        input={"customer": mock_customer_data},
    )
    assert result.run_id is not None

    # Poll for completion, with a deadline so the test can't hang forever
    deadline = time.time() + 120
    status = do_client.workflows.get_status(result.run_id)
    while status not in ("COMPLETED", "FAILED") and time.time() < deadline:
        time.sleep(2)
        status = do_client.workflows.get_status(result.run_id)

    assert status == "COMPLETED"
```
Once a pull request is tested, validated, and merged into your main branch, the CD stage can automatically deploy the changes to your .do account.
This is handled by a deployment script that uses the SDK to create or update workflows on the platform.
scripts/deploy.js

```javascript
import { Do } from '@do-platform/sdk';
import * as fs from 'fs';
import * as path from 'path';
import { fileURLToPath } from 'url';

// ES modules don't provide __dirname, so derive it from import.meta.url
const __dirname = path.dirname(fileURLToPath(import.meta.url));

// Use a production or staging key based on the environment
const doClient = new Do({
  apiKey: process.env.DO_API_KEY_PROD,
});

async function deployWorkflows() {
  const workflowsDir = path.join(__dirname, '../workflows');
  const workflowFiles = fs.readdirSync(workflowsDir)
    .filter((file) => file.endsWith('.json'));

  for (const file of workflowFiles) {
    console.log(`Deploying ${file}...`);
    const filePath = path.join(workflowsDir, file);
    const workflowDef = JSON.parse(fs.readFileSync(filePath, 'utf-8'));

    try {
      // 'upsert' creates the workflow if it doesn't exist,
      // or updates it if it does.
      await doClient.workflows.upsert({
        name: workflowDef.name,
        definition: workflowDef.definition,
        // other metadata...
      });
      console.log(`Successfully deployed ${workflowDef.name}.`);
    } catch (error) {
      console.error(`Failed to deploy ${workflowDef.name}:`, error);
      process.exit(1); // Fail the pipeline
    }
  }
}

deployWorkflows();
```
Now, let's define the pipeline that executes these steps.
.github/workflows/main.yml

```yaml
name: CI/CD for .do Workflows

on:
  push:
    branches:
      - main
  pull_request:
    branches:
      - main

jobs:
  test:
    name: Validate and Test Workflows
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v3

      - name: Setup Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.10'

      - name: Install dependencies
        run: pip install -r requirements.txt # (assuming a requirements file)

      - name: Run Integration Tests
        env:
          DO_API_KEY_TESTING: ${{ secrets.DO_API_KEY_TESTING }}
        run: pytest tests/

  deploy:
    name: Deploy to Production
    needs: test # This job only runs if 'test' succeeds
    if: github.ref == 'refs/heads/main' && github.event_name == 'push'
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v3

      - name: Setup Node.js
        uses: actions/setup-node@v3
        with:
          node-version: 18

      - name: Install JS Dependencies
        run: npm install

      - name: Deploy Workflows
        env:
          DO_API_KEY_PROD: ${{ secrets.DO_API_KEY_PROD }}
        run: node scripts/deploy.js
```
This pipeline ensures that:

- Every push and pull request against `main` runs the validation and integration tests.
- Deployment happens only after the `test` job succeeds (`needs: test`).
- Production is updated only on a push to `main`, never from an open pull request.
By adopting a workflows-as-code approach with .do, you're not just automating a business process; you're building a reliable, maintainable, and scalable automation engine. You gain the confidence that comes from robust testing and the agility that comes from automated deployment.
Ready to bring the power of CI/CD to your business logic? Explore the official .do SDK documentation and get your API key to start building today.
Q: What is the .do SDK?
A: The .do Software Development Kit (SDK) is a set of tools, libraries, and documentation that allows developers to easily integrate the .do agentic workflow platform directly into their own applications and automation pipelines.
Q: Which programming languages are supported?
A: We provide official SDKs for popular languages like TypeScript/JavaScript, Python, and Go. Check our developer documentation for the most up-to-date list and community-supported libraries.
Q: How do I get an API key?
A: You can generate an API key from your .do account dashboard. Navigate to the 'API Settings' section to create and manage your keys securely. We recommend using separate keys for testing and production environments.
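One small pattern that supports this recommendation is a helper that resolves the right key from the environment and fails loudly when it is missing. This is an illustrative sketch; the variable names simply match the secrets used in the example pipeline above.

```python
import os

# Map each environment to its key variable; these names match the
# secrets used in the example CI/CD pipeline.
KEY_VARS = {
    "testing": "DO_API_KEY_TESTING",
    "production": "DO_API_KEY_PROD",
}


def get_do_api_key(environment: str) -> str:
    """Look up the API key for the given environment, failing loudly if unset."""
    try:
        var_name = KEY_VARS[environment]
    except KeyError:
        raise ValueError(f"Unknown environment: {environment!r}")
    key = os.getenv(var_name)
    if not key:
        raise RuntimeError(f"{var_name} is not set")
    return key
```

Failing fast here is deliberate: a missing key should stop the pipeline immediately rather than letting a test run silently hit production.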
Q: Is the SDK free to use?
A: Yes, our SDKs are free to download and use. Usage of the underlying .do platform is subject to our standard pricing plans, which are based on workflow executions and agent usage.