Custom Workflow Types
The engine supports using any identifier as a workflow type instead of being limited to the workflow keyword. This allows for more semantic and descriptive workflow definitions.
Overview
Previously, all workflow definitions had to use the workflow keyword. Now you can use any identifier that better describes the purpose of your workflow, such as project, application, feature, solution, or any custom type.
Basic Usage
Standard Workflow
workflow example {
start: Initialize
state Initialize {
action services/common/response.ResponseValue(message: "Hello", statusCode: 200)
end ok
}
}
Custom Types
project my_project {
start: Setup
state Setup {
action project/initialize.Setup()
on success -> Complete
}
state Complete {
end ok
}
}
application user_app {
start: Launch
state Launch {
action app/startup.Initialize()
on success -> Ready
}
state Ready {
action services/common/response.ResponseValue(message: "App ready", statusCode: 200)
end ok
}
}
microservice auth_service {
start: Authenticate
state Authenticate {
action auth/verify.CheckCredentials()
on success -> GenerateToken
on error -> Unauthorized
}
state GenerateToken {
action auth/token.Generate()
end ok
}
state Unauthorized {
action services/common/response.ResponseError(message: "Unauthorized", code: 401)
end error
}
}
Common Type Keywords
While you can use any identifier, these are commonly used types:
workflow
Standard workflow for general-purpose task orchestration.
workflow data_processing {
start: Process
// ...
}
feature
Groups related workflows together. Features can orchestrate multiple workflows.
feature user_management {
start: CreateUser
// Can call workflow items
}
solution
High-level orchestration that can combine features and workflows.
solution complete_system {
start: Initialize
// Can call feature and workflow items
}
project
For project-specific implementations or prototypes.
project analytics_dashboard {
start: Setup
// ...
}
application
For full application workflows.
application mobile_app {
start: Launch
// ...
}
Custom Types
Create your own types based on your domain:
pipeline data_pipeline {
start: Ingest
// ...
}
job batch_job {
start: Execute
// ...
}
task scheduled_task {
start: Run
// ...
}
Type in Code
The workflow type is available in the Flow domain object and can be used for conditional processing:
flow := engine.GetFlow()
// Branch on the workflow type for conditional processing
switch flow.Type {
case "project":
// Handle project-specific logic
case "application":
// Handle application-specific logic
case "workflow":
// Handle standard workflow logic
default:
// Handle custom types
}
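The switch above can also be expressed as a dispatch table, which scales better as the number of types grows. This is an illustrative sketch, not engine API: only the Type field comes from the Flow domain object described above; the Flow struct shape, handler functions, and messages here are placeholders.

```go
package main

import "fmt"

// Flow is a stand-in for the engine's domain object; only the
// Type field from the documentation above is assumed.
type Flow struct {
	Name string
	Type string
}

// handlers maps workflow types to type-specific processing functions.
var handlers = map[string]func(Flow) string{
	"project":     func(f Flow) string { return "project logic for " + f.Name },
	"application": func(f Flow) string { return "application logic for " + f.Name },
	"workflow":    func(f Flow) string { return "standard logic for " + f.Name },
}

// dispatch picks a handler by workflow type, falling back to a
// default branch for custom types such as pipeline or job.
func dispatch(f Flow) string {
	if h, ok := handlers[f.Type]; ok {
		return h(f)
	}
	return "custom type " + f.Type + " for " + f.Name
}

func main() {
	fmt.Println(dispatch(Flow{Name: "my_project", Type: "project"}))
	fmt.Println(dispatch(Flow{Name: "data_pipeline", Type: "pipeline"}))
}
```

Registering handlers in a map keeps the default case in one place, so new custom types work without touching existing branches.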
Use Cases
1. Semantic Clarity
Use type names that clearly communicate intent:
pipeline etl_pipeline {
start: Extract
state Extract {
action data/extract.FromSource()
on success -> Transform
}
state Transform {
action data/transform.Apply()
on success -> Load
}
state Load {
action data/load.ToDestination()
end ok
}
}
2. Organizational Hierarchy
// Top-level solution
solution ecommerce_platform {
start: InitializeFeatures
// Orchestrates features
}
// Mid-level features
feature payment_processing {
start: InitializePayments
// Orchestrates workflows
}
// Low-level workflows
workflow process_payment {
start: ValidateCard
// Executes steps
}
3. Domain-Specific Language
Create types that match your business domain:
order order_fulfillment {
start: ReceiveOrder
// ...
}
shipment track_shipment {
start: CreateShipment
// ...
}
invoice generate_invoice {
start: CalculateTotal
// ...
}
Implementation Details
The workflow type is captured and propagated through the entire parsing and execution pipeline:
- Parser - Accepts any identifier as a workflow type token
- CST (Concrete Syntax Tree) - Stores the type token
- AST (Abstract Syntax Tree) - Includes a Type field
- IR (Intermediate Representation) - Includes a WorkflowType field
- Schema - Includes a type field
- Domain Flow - The Flow.Type field is populated from the schema
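As a rough sketch of how the type threads through those layers, the snippet below lowers an AST node to an IR node and then to the domain Flow. The struct shapes and the lower function are hypothetical, invented for this illustration; only the field names Type, WorkflowType, and Flow.Type come from the list above.

```go
package main

import "fmt"

// ASTWorkflow is a hypothetical AST node; per the pipeline above,
// it carries the parsed type identifier in a Type field.
type ASTWorkflow struct {
	Type string // e.g. "pipeline", "workflow"
	Name string
}

// IRWorkflow is a hypothetical IR node with a WorkflowType field.
type IRWorkflow struct {
	WorkflowType string
	Name         string
}

// Flow is the domain object; its Type field is populated
// downstream from the schema's type field.
type Flow struct {
	Type string
	Name string
}

// lower carries the type from AST to IR to the domain Flow,
// mirroring the propagation described in the list above.
func lower(a ASTWorkflow) Flow {
	ir := IRWorkflow{WorkflowType: a.Type, Name: a.Name}
	return Flow{Type: ir.WorkflowType, Name: ir.Name}
}

func main() {
	f := lower(ASTWorkflow{Type: "pipeline", Name: "data_pipeline"})
	fmt.Println(f.Type, f.Name)
}
```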
Best Practices
1. Be Consistent
Use consistent type names across your project:
// Good - consistent naming
workflow user_login { ... }
workflow user_logout { ... }
workflow user_profile { ... }
// Avoid - inconsistent types
workflow user_login { ... }
process user_logout { ... }
handler user_profile { ... }
2. Use Hierarchical Types
Match your type to the level of abstraction:
- solution - Highest level, orchestrates features
- feature - Mid level, orchestrates workflows
- workflow - Base level, executes steps
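One way to keep this convention honest is a small lint that checks orchestration only flows downward. This is a sketch of the convention above, not something the engine enforces; the level map and canCall helper are invented for illustration, and custom types default to the base level here.

```go
package main

import "fmt"

// level encodes the hierarchy convention: solution > feature > workflow.
// Types missing from the map (custom types) default to the base level 0.
var level = map[string]int{
	"solution": 2,
	"feature":  1,
	"workflow": 0,
}

// canCall reports whether a definition of type caller may orchestrate
// a definition of type callee (strictly lower in the hierarchy).
func canCall(caller, callee string) bool {
	return level[caller] > level[callee]
}

func main() {
	fmt.Println(canCall("solution", "feature")) // true
	fmt.Println(canCall("feature", "workflow")) // true
	fmt.Println(canCall("workflow", "feature")) // false
}
```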
3. Choose Meaningful Names
Pick names that convey the purpose:
// Good - clear purpose
pipeline data_ingestion { ... }
validator input_validator { ... }
transformer json_transformer { ... }
// Avoid - vague names
thing my_thing { ... }
stuff process_stuff { ... }
4. Document Custom Types
When using custom types, document their purpose:
// Custom type for scheduled background jobs
// Runs periodically to clean up old data
cleanup_job database_cleanup {
start: ScanTables
// ...
}
Backward Compatibility
This feature is fully backward compatible. All existing workflows using the workflow keyword continue to work exactly as before. The workflow keyword is simply treated as one of many possible type values.
Examples
E-commerce System
module ecommerce
// Solution orchestrates the entire system
solution ecommerce_platform {
start: InitializeSystem
state InitializeSystem {
action feature user_management
on success -> StartPayments
}
state StartPayments {
action feature payment_processing
on success -> Ready
}
state Ready {
end ok
}
}
// Feature for user management
feature user_management {
start: SetupAuth
state SetupAuth {
action workflow user_authentication
on success -> SetupProfile
}
state SetupProfile {
action workflow user_profile_management
end ok
}
}
// Individual workflow for authentication
workflow user_authentication {
start: ValidateCredentials
state ValidateCredentials {
action auth/validate.Credentials()
on success -> GenerateToken
on error -> LoginFailed
}
state GenerateToken {
action auth/token.Generate()
end ok
}
state LoginFailed {
action services/common/response.ResponseError(message: "Login failed", code: 401)
end error
}
}
Data Processing Pipeline
module analytics
pipeline real_time_analytics {
start: Ingest
state Ingest {
action stream/kafka.Consume(topic: "events")
on success -> Filter
}
state Filter {
action processing/filter.Apply(rules: $constants.filterRules)
on success -> Transform
}
state Transform {
action processing/transform.Execute()
on success -> Aggregate
}
state Aggregate {
action processing/aggregate.Compute()
on success -> Store
}
state Store {
action storage/timeseries.Write()
end ok
}
}
Related Features
- Orchestration - How features and solutions orchestrate workflows
- WSL Overview - Introduction to WSL syntax
- WSL Specification - Complete grammar reference