Turn code into durable services

The Pythonic bridge between synchronous APIs and long-running work. Offload tasks, scale compute, and ship resilient systems without the infrastructure boilerplate.

Double River Investments

Prefect allows us to create a microservices-like architecture for our data pipelines, essentially acting as a contract between independent teams.

Nelson Griffiths
Engineering Lead

Snorkel AI

We improved throughput by 20x with Prefect. It's our workhorse for asynchronous processing—a Swiss Army knife. We run about a thousand flows an hour and we're perfectly fine since most of these are network bound.

Smit Shah
Director of Engineering

Async by Default

Don't block the main thread

Your API should be fast. Your background tasks should be durable. Prefect is the Pythonic bridge between the two, letting you offload heavy compute, AI inference, and data processing to workers that retry automatically.

worker.py
from prefect import task

# Define your task with built-in retries
@task(retries=3, retry_delay_seconds=10)
def process_user_event(event):
    # ... business logic ...
    pass

if __name__ == "__main__":
    # Start an embedded worker
    # No separate broker required for local dev
    process_user_event.serve(name="event-processor")
app.py
# Fire and forget from anywhere in your app
from worker import process_user_event

process_user_event.delay(event_data)

The Modern Task Queue

Kill your legacy task queue

Stop wrestling with configuration. Prefect replaces brittle queues with resilient Python code. Run embedded workers alongside your app for development, then scale them independently in production—no separate broker required.

  • No separate message broker required locally
  • Full visibility into every retry and failure
  • Type-safe inputs validated by Pydantic (see the sketch below)
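
For the Pydantic point above, here is a minimal sketch (the OrderEvent model, process_order flow, and validated_flow.py name are illustrative, not part of Prefect): annotate a flow parameter with a Pydantic model and Prefect validates and coerces the raw input before your code runs.

validated_flow.py
from prefect import flow
from pydantic import BaseModel

class OrderEvent(BaseModel):
    order_id: str
    amount: float

@flow
def process_order(event: OrderEvent):
    # Prefect coerces the incoming payload into OrderEvent and rejects
    # anything that does not match before the flow body runs
    print(f"Processing {event.order_id} for {event.amount:.2f}")

if __name__ == "__main__":
    # A plain dict is validated and coerced into the model
    process_order(event={"order_id": "A-123", "amount": "19.99"})

A payload that fails validation fails the run immediately with a clear error, rather than partway through your business logic.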

Event-Driven Architecture

Event-driven without the Kafka headache

Build reactive systems with native event support. Emit events from your application, trigger workflows instantly, and let Prefect handle the routing and delivery logic.

  • Emit events from anywhere in your app
  • Trigger flows instantly with Automations
  • Decoupled producers and consumers
events.py
from prefect.events import emit_event

def on_user_signup(user):
    # 1. Emit an event from your app
    emit_event(
        event="user.created",
        resource={"prefect.resource.id": user.id},
        payload={"email": user.email}
    )
trigger.py
from prefect import flow
from prefect.events import DeploymentEventTrigger

@flow
def send_welcome_email(user_data):
    # ... email logic ...
    pass

# 2. Trigger a flow when the event occurs
send_welcome_email.serve(
    name="welcome-email-sender",
    triggers=[
        DeploymentEventTrigger(
            expect=["user.created"],
            parameters={"user_data": "{{ event.payload }}"}
        )
    ]
)
api.py
from fastapi import FastAPI
from prefect.deployments import run_deployment

app = FastAPI()

@app.post("/process-order")
async def create_order(order_id: str):
    # Offload to a background worker immediately; timeout=0 returns
    # as soon as the run is scheduled instead of waiting for it to finish
    flow_run = await run_deployment(
        name="order-processing/prod",
        parameters={"order_id": order_id},
        timeout=0,
    )

    return {"status": "accepted", "id": order_id, "flow_run_id": str(flow_run.id)}

Workflows as a Service

Instant APIs from Python functions

User experience demands speed; business logic demands reliability. Hand off complex operations to Prefect instantly, ensuring your API stays responsive while critical workflows execute durably in the background.

  • Zero-latency handoff
  • Automatic retries & error handling
  • Full observability trace (poll run state from your API, as sketched below)
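
To make that observability concrete, here is a minimal sketch of a status endpoint (status_api.py and the route are illustrative names, not a Prefect API); it assumes your handoff endpoint hands the flow run id back to the caller, as api.py above does after capturing run_deployment's return value, and uses Prefect's client to read the run's current state.

status_api.py
from uuid import UUID

from fastapi import FastAPI
from prefect import get_client

app = FastAPI()

@app.get("/orders/{flow_run_id}/status")
async def order_status(flow_run_id: UUID):
    # Look up the background flow run scheduled by the handoff endpoint
    async with get_client() as client:
        flow_run = await client.read_flow_run(flow_run_id)
    return {"flow_run_id": str(flow_run_id), "state": flow_run.state_name}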

Infrastructure as Code

Compute that scales to zero

Run tasks on the hardware they need—whether that's a local process, a Kubernetes pod, or a serverless GPU on Modal. Define it in Python, deploy it anywhere.

  • Environment-agnostic execution
  • Native integration with Modal, Ray, and Dask (see the Dask sketch below)
infra_flow.py
from prefect import flow, task

# Run on specialized hardware
@task(tags=["modal-gpu"])
def train_model(dataset):
    import torch
    # ... heavy training logic ...
    return model_weights

@flow
def ml_pipeline():
    data = load_data()

    # Executes remotely, scales to zero
    weights = train_model(data)

    save_artifacts(weights)
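
For the Dask integration noted above, swapping the task runner is usually all it takes. A minimal sketch, assuming the prefect-dask package is installed (the score task, parallel_pipeline flow, and dask_flow.py name are illustrative):

dask_flow.py
from prefect import flow, task
from prefect_dask import DaskTaskRunner  # pip install prefect-dask

@task
def score(item: int) -> int:
    # Stand-in for real per-item work
    return item * item

# DaskTaskRunner() starts a temporary local Dask cluster by default;
# point it at an existing cluster address to scale out
@flow(task_runner=DaskTaskRunner())
def parallel_pipeline(items: list[int]) -> list[int]:
    futures = [score.submit(i) for i in items]  # fan out across Dask workers
    return [f.result() for f in futures]

if __name__ == "__main__":
    print(parallel_pipeline(list(range(8))))

The flow code itself does not change between a laptop and a cluster; only the task runner configuration does.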

Engineered for Reliability

The standard for building durable, distributed systems in Python.

Workflows as Microservices

Decompose your application into independently deployable units. Scale individual components without monolith bloat.

Heavy Compute

Move slow, resource-intensive work out of the critical path. Run on background workers, serverless functions, or specialized hardware.

Event-Driven

Trigger workflows from webhooks, cloud events, or other flows. Build reactive systems that respond to the real world instantly.

Pure Python

Define workflows in standard Python. No YAML configurations, no DSLs. Native type safety with Pydantic validation built-in.

Ship code that runs until it succeeds

Join thousands of engineers building resilient applications with Prefect.
