Your Airflow upgrade is coming. Make it count.
Airflow 2 is now in limited maintenance: security and critical bug fixes only. Airflow 3 requires significant architectural changes. Choose the platform that moves you forward.
Limited maintenance is active
Airflow 2 entered limited maintenance 71 days ago
Full Support
This phase has ended
Limited Maintenance
Security and critical bug fixes only. No new features.
End of Life
No more updates. Upgrade to Airflow 3 or switch platforms.
What actually breaks in Airflow 3
The most significant architectural changes since Airflow 2.0
No direct database access
Custom operators that read or write the metadata database directly must be refactored to use the new Task Execution API.
SubDAGs removed
SubDagOperator no longer exists. Every SubDAG must be converted to a TaskGroup, which often means significant code rewrites.
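As a rough sketch of what that conversion looks like: the Airflow 3 Task SDK exposes dag, task, and task_group decorators under airflow.sdk (an assumption based on the Airflow 3 Task SDK; check your version's reference). The ImportError fallback below is only there so the sketch can be read and run without Airflow installed.

```python
# Sketch of a SubDAG -> TaskGroup conversion using Airflow 3 Task SDK
# import paths (assumed; verify against your Airflow 3 release notes).
try:
    from airflow.sdk import dag, task, task_group
except ImportError:
    # Fallback so this sketch runs as plain Python without Airflow.
    def _passthrough(*args, **kwargs):
        # Support both bare @decorator and parameterized @decorator(...).
        if len(args) == 1 and callable(args[0]) and not kwargs:
            return args[0]
        return lambda fn: fn
    dag = task = task_group = _passthrough

@dag(dag_id="etl_taskgroups")
def etl():
    @task
    def extract() -> list[int]:
        return [1, 2, 3]

    # The @task_group decorator replaces the whole SubDAG + SubDagOperator
    # pair; the group runs inside the parent DAG's scheduler and executor.
    @task_group(group_id="transform_group")
    def transform_group(rows: list[int]) -> list[int]:
        @task
        def double(rows: list[int]) -> list[int]:
            return [r * 2 for r in rows]
        return double(rows)

    @task
    def load(rows: list[int]) -> int:
        return sum(rows)

    load(transform_group(extract()))

etl()
```

Unlike a SubDAG, a TaskGroup is purely a UI/namespacing construct, so it avoids the deadlock and scheduling problems SubDAGs were known for.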
Every import changes
Public import paths move from airflow.models and other internal modules to airflow.sdk. Every file that touches Airflow needs updates.
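A stdlib-only codemod can handle the bulk of the mechanical rewrites. The old-to-new mapping below covers only a few common cases and is an assumption; the full list lives in the Airflow 3 release notes.

```python
# Minimal sketch of a codemod for the Airflow 3 import-path overhaul.
# The rewrite table is illustrative, not exhaustive.
import re

IMPORT_REWRITES = {
    r"from airflow\.models import DAG": "from airflow.sdk import DAG",
    r"from airflow\.decorators import (dag|task)": r"from airflow.sdk import \1",
    r"from airflow\.models\.baseoperator import BaseOperator":
        "from airflow.sdk import BaseOperator",
}

def rewrite_imports(source: str) -> str:
    """Apply each old->new import rewrite to a DAG file's source text."""
    for old, new in IMPORT_REWRITES.items():
        source = re.sub(old, new, source)
    return source

old_dag = "from airflow.models import DAG\nfrom airflow.decorators import task\n"
print(rewrite_imports(old_dag))
```

Run it over a copy of your dags/ folder first and diff the result; a regex codemod is a starting point, not a substitute for reviewing each file.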
New mandatory components
dag-processor and api-server are now required. Your deployment infrastructure needs restructuring.
Template variables removed
execution_date, prev_ds, next_ds, and other common template variables are gone; logical_date and the data_interval_* variables replace most of them. DAGs that still reference the removed names will fail.
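A pre-flight scan for the removed names is cheap to run before you upgrade. The removed/replacement table below is an assumption based on the Airflow 3 release notes; verify it against your target version.

```python
# Quick pre-flight check for template variables removed in Airflow 3.
# The mapping of removed names to suggested replacements is illustrative.
import re

REMOVED_TEMPLATE_VARS = {
    "execution_date": "logical_date",
    "prev_ds": "derive from the previous run's data_interval_start",
    "next_ds": "derive from data_interval_end",
    "tomorrow_ds": "compute from data_interval_end",
    "yesterday_ds": "compute from data_interval_start",
}

def find_removed_template_vars(dag_source: str) -> dict[str, str]:
    """Return {removed_var: suggested_replacement} for names found in source."""
    return {
        var: replacement
        for var, replacement in REMOVED_TEMPLATE_VARS.items()
        if re.search(rf"\b{var}\b", dag_source)
    }

snippet = 'BashOperator(task_id="t", bash_command="echo {{ ds }} {{ execution_date }}")'
print(find_removed_template_vars(snippet))  # flags execution_date only
```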
Cannot skip versions
Must upgrade through 2.7 → 2.10 → 3.0. No direct jumps allowed. Each step requires testing.
What developers are saying
Community sentiment from Hacker News and Reddit
Upgrades have been an absolute nightmare and so disruptive... We've since tried multiple times to upgrade past the 2.0 release and hit issues every time, so we are just done with it. We'll stay at 2.0 until we eventually move off airflow altogether.
If my company had neither Airflow nor Prefect in place already, I'd opt for Prefect. I believe it allows for much better modularization of code... You can achieve something similar with Airflow, but you really need to go out of your way to make something like that happen, whereas in Prefect it kind of naturally comes out.
I despise airflow and how cemented it is as data infrastructure... It's taken me 3 separate jobs over 7 years to realize that it's probably not our fault. Everyone seems to struggle with the same things: flaky scheduler that is slow to run tasks, confusing settings... It can't handle a large number of parallel tasks or frequent runs. It seems to have miserable scalability for the resources given.
My impression is that Airflow is a really dated choice for a greenfield project.
What Airflow 3 upgrades look like in the wild
Real issues teams hit when migrating from Airflow 2 to Airflow 3.
Upgrading from Airflow 2.10.5 to 3.0.0 failed due to database migration errors. The migration tried to drop an index that's still needed for a foreign key constraint, causing the entire upgrade process to fail.
After waiting for Airflow 3.1 to fix initial bugs, the upgrade from 2.11.0 to 3.1.0 still failed. The database migration couldn't convert the XCom table from bytea to jsonb because it contained invalid JSON data, blocking the entire upgrade.
After upgrading to Airflow 3, DAG import errors started occurring. The workaround requires manually updating dag_id parameters or deleting entries from serialized_dag and dag_code tables, which is a risky manual database operation.
After upgrading to Airflow 3.0.6, multiple executors stopped working correctly. Tasks assigned to KubernetesExecutor were being routed to CeleryExecutor instead, regardless of queue configuration, breaking our entire task execution strategy.
After upgrading from Airflow 3.0.3 to 3.1.0, KubernetesExecutor stopped working entirely. Tasks that specified KubernetesExecutor failed with errors saying the executor wasn't available, even though it was properly configured.
Upgrading from Airflow 2.11 to 3.1.3 on MySQL 8.0.35 failed because the migration used index creation syntax that MySQL doesn't support. The migration script assumed database features that don't exist in MySQL.
Teams that made the switch
Real migration stories with real results
73.78% cost reduction
Migrated 72 pipelines in 2.5 months with 2 engineers. Tripled production while dramatically reducing spend.
ML workflows migration
Moved from Airflow when it was no longer viable for ML. Gained security and ease of adoption.
70% infrastructure savings
Gradual migration focused on critical flows first. Gained stability and significant cost reduction.
The Data Engineering and MLOps teams were impressed by the elimination of retrofitting requirements. Switching from Astronomer to Prefect resulted in a 73.78% reduction in invoice costs alone.
Airflow was no longer viable for ML workflows. We needed security and ease of adoption.
Teams using Prefect in production
Same effort. Different outcome.
Both paths require work. Choose the one that moves you forward.
Upgrade to Airflow 3
- Step through 2.7 → 2.10 → 3.0 versions
- Refactor all custom operators for new API
- Update every import path in your codebase
- Convert all SubDAGs to TaskGroups
- Restructure deployment for new components
- Still have static DAGs and centralized scheduler
Migrate to Prefect
- Remove Airflow boilerplate, keep your Python logic
- Simple decorators instead of operator classes
- Dynamic workflows that adapt to your data
- 60-70% infrastructure cost savings
- Your data stays in your infrastructure
- Migration assistance from our team
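To make "simple decorators instead of operator classes" concrete, here is a minimal sketch of an ETL pipeline in Prefect's flow/task style. With Prefect installed, calling the flow function runs it locally with orchestration and tracking; the ImportError fallback below only lets the sketch run as plain Python for readers without Prefect.

```python
# Sketch of an ETL pipeline as Prefect flows and tasks.
try:
    from prefect import flow, task
except ImportError:
    # Fallback so the sketch runs as plain Python without Prefect.
    def _passthrough(*args, **kwargs):
        if len(args) == 1 and callable(args[0]) and not kwargs:
            return args[0]
        return lambda fn: fn
    flow = task = _passthrough

@task(retries=2)          # retry policy lives on the task itself
def extract() -> list[int]:
    return [1, 2, 3]

@task
def transform(rows: list[int]) -> list[int]:
    return [r * 2 for r in rows]

@flow
def etl() -> int:
    # Plain function calls instead of operator classes and XCom plumbing;
    # data moves between tasks as ordinary Python return values.
    return sum(transform(extract()))

print(etl())  # -> 12
```

Because flows are ordinary functions, the same code runs in a unit test, a notebook, or production without a scheduler round-trip.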
We'll help you migrate
Our team has helped hundreds of organizations transition from Airflow to Prefect. We'll work with you to plan your migration and ensure a smooth transition.
- Migration assessment and planning
- Best practices for converting DAGs to flows
- Dedicated support during transition
- Transparent pricing without surprises
Start your migration assessment
Talk to our team about your Airflow infrastructure, timeline, and migration strategy. Free consultation, no commitment.