Graph-Based Architecture
Build dynamic workflows with interconnected TaskWorkers for highly customizable automation
Type-Safe with Pydantic
Ensure data integrity and consistency across workflows with Pydantic-validated inputs and outputs
LLM Integration
Seamlessly combine traditional computations with LLM-powered operations and RAG capabilities
Built-in Monitoring
Track workflow execution in real-time with the integrated web dashboard and provenance tracking
from typing import List, Type

from planai import Graph, TaskWorker, Task, LLMTaskWorker, llm_from_config

# Define a simple data processor
class DataProcessor(TaskWorker):
    output_types: List[Type[Task]] = [ProcessedData]

    def consume_work(self, task: RawData):
        processed = self.process(task.data)
        self.publish_work(ProcessedData(data=processed))

# Add AI analysis with an LLM
class AIAnalyzer(LLMTaskWorker):
    prompt = "Analyze the data and provide insights"
    output_types: List[Type[Task]] = [AnalysisResult]

# Create and run the workflow
graph = Graph(name="Analysis Pipeline")
processor = DataProcessor()
analyzer = AIAnalyzer(llm=llm_from_config("openai", "gpt-4"))

graph.add_workers(processor, analyzer)
graph.set_dependency(processor, analyzer)
graph.run(initial_tasks=[(processor, RawData(data="..."))])
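The snippet above leaves RawData, ProcessedData, and AnalysisResult undefined. Because PlanAI tasks are Pydantic models (see Type-Safe with Pydantic above), minimal definitions could look like the sketch below; the data fields mirror the example, while the AnalysisResult field name is an illustrative assumption:

from planai import Task

class RawData(Task):
    data: str

class ProcessedData(Task):
    data: str

class AnalysisResult(Task):
    # Field name is illustrative; shape the model to the insights you need
    insights: str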
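To watch this pipeline in the monitoring dashboard mentioned above, the graph can be started with the dashboard enabled. The run_dashboard parameter name is an assumption here; confirm it against the Graph.run signature in your installed PlanAI version:

# run_dashboard is an assumed parameter name; check Graph.run in your PlanAI version
graph.run(
    initial_tasks=[(processor, RawData(data="..."))],
    run_dashboard=True,  # serve the real-time web dashboard while the workflow runs
)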
Intelligent Routing
Type-aware data routing automatically manages flow between nodes (see the sketch after this list)
Provenance Tracking
Trace task lineage through the entire workflow for debugging
Caching Support
Built-in caching for expensive operations and LLM responses
Prompt Optimization
AI-driven automatic prompt improvement using production data
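As a concrete illustration of the type-aware routing above, a worker can declare several output types and publish whichever fits its input; each published task is delivered to the downstream worker whose consume_work accepts that type. The names below (Classifier, UrgentItem, RoutineItem, and the handler workers) are illustrative and use only the API shown in the earlier example; RawData is the Task model defined there:

from typing import List, Type

from planai import Graph, Task, TaskWorker

class UrgentItem(Task):
    data: str

class RoutineItem(Task):
    data: str

class Classifier(TaskWorker):
    output_types: List[Type[Task]] = [UrgentItem, RoutineItem]

    def consume_work(self, task: RawData):
        # Publish whichever output type fits; PlanAI routes it to the matching consumer
        if "urgent" in task.data.lower():
            self.publish_work(UrgentItem(data=task.data))
        else:
            self.publish_work(RoutineItem(data=task.data))

class UrgentHandler(TaskWorker):
    output_types: List[Type[Task]] = []

    def consume_work(self, task: UrgentItem):
        print("urgent:", task.data)

class RoutineHandler(TaskWorker):
    output_types: List[Type[Task]] = []

    def consume_work(self, task: RoutineItem):
        print("routine:", task.data)

classifier, urgent, routine = Classifier(), UrgentHandler(), RoutineHandler()
graph = Graph(name="Routing Sketch")
graph.add_workers(classifier, urgent, routine)
graph.set_dependency(classifier, urgent)
graph.set_dependency(classifier, routine)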
Ready to build your first AI workflow? Check out our guides:
Installation Guide
Get PlanAI installed and set up your development environment
Quick Start Tutorial
Build your first workflow in minutes with our step-by-step guide
Examples
Explore real-world examples including research assistants and Q&A generation