
Airbyte vs. Airflow

Airbyte is an open-source data integration / ETL alternative to Airflow. Compare data sources and destinations, features, pricing and more. Understand their differences and pros / cons.


About the services

About Airbyte

Airbyte is the leading open-source ELT tool, created in July 2020. As of September 2021, the team has built over 120 data connectors, and 5,000 companies use Airbyte to sync data. Their ambition is to commoditize data integration by addressing the long tail of connectors through a growing contributor community. Airbyte released a Cloud offering in October 2021 with an infrastructure-type pricing model.

About Airflow

Apache Airflow is an open-source workflow management tool. Airflow is not an ETL tool, but you can use Airflow operators to extract, transform and load data between different systems. Airflow started in 2014 at Airbnb as a solution to manage the company's workflows. Airflow lets you author, schedule and monitor workflows as DAGs (directed acyclic graphs) written in Python.
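To illustrate what "workflows as DAGs" means in practice, here is a minimal sketch using only Python's standard library (`graphlib`), not Airflow itself; the task names are invented for the example. A scheduler like Airflow runs tasks in an order that respects their dependencies, which is exactly a topological ordering of the DAG.

```python
from graphlib import TopologicalSorter

# A toy workflow expressed as a DAG: each task maps to the set of
# tasks it depends on (the same idea Airflow encodes with >> / <<).
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "notify": {"load"},
}

# TopologicalSorter yields tasks in an order that respects dependencies,
# which is how a scheduler decides what can run next.
order = list(TopologicalSorter(dag).static_order())
print(order)  # → ['extract', 'transform', 'load', 'notify']
```

In real Airflow code the same shape is declared with operators and the `>>` dependency syntax inside a `DAG` context.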

Features

Focus

Airbyte: ELT as a first step. Reverse-ETL coming in 2022.

Airflow: Workflow management.

Sources

Airbyte: More than 120, one year from inception. The goal is 200 by the end of 2021.

Airflow: More than 30 sources via transfer operators. Sources are tightly coupled with destinations.

Destinations

Airbyte: All data warehouses, lakes and databases.

Airflow: All major data warehouses, lakes and databases. Destinations are tightly coupled with sources.

Customizability of connectors

Airbyte: Users can edit any pre-built connector and build new ones within 2 hours with Airbyte’s Connector Development Kit.

Airflow: Users can edit any pre-built operator and build their own.

Database replication

Airbyte: Full table and incremental replication via change data capture. Pricing adapted for this use case.

Airflow: Full table replication only. Incremental replication requires coding your own logic in your Airflow DAGs and SQL files to extract only new data.
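The "own logic" you would have to write for incremental replication in Airflow is typically a high-watermark query. Here is a minimal, self-contained sketch of that pattern using an in-memory SQLite table; the table and column names are invented for the example.

```python
import sqlite3

# Hypothetical source table; `updated_at` is the high-watermark column.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, updated_at TEXT)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(1, "2021-09-01"), (2, "2021-09-05"), (3, "2021-09-09")],
)

def extract_incremental(conn, last_watermark):
    """Pull only rows newer than the watermark saved by the previous run."""
    rows = conn.execute(
        "SELECT id, updated_at FROM orders "
        "WHERE updated_at > ? ORDER BY updated_at",
        (last_watermark,),
    ).fetchall()
    # Advance the watermark so the next run skips these rows.
    new_watermark = rows[-1][1] if rows else last_watermark
    return rows, new_watermark

rows, wm = extract_incremental(conn, "2021-09-01")
print(rows, wm)  # only ids 2 and 3; watermark moves to 2021-09-09
```

In an Airflow DAG you would persist the watermark between runs (for example in a variable or a state table) and call logic like this from each scheduled task.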

Integration with data stack

Airbyte: Integrates deeply with Kubernetes, Airflow and dbt. Airbyte will soon integrate with Prefect, Dagster, Great Expectations, and more. Integrations can be contributed by the community.

Airflow: Integrates deeply with Kubernetes, dbt, Airbyte and more.

Support SLAs

Airbyte: Available.

Airflow: N/A.

Security certifications

Airbyte: SOC 2.

Airflow: N/A.

Vendor lock-in

Airbyte: Airbyte Core (ELv2) and connectors (MIT) are open source.

Airflow: Airflow Core and operators are open source.

Purchase process

Airbyte: Self-service or sales for Airbyte Cloud. Open-source edition deployable in minutes.

Airflow: Self-service for managed services with Google Cloud Composer and Amazon Managed Workflows for Apache Airflow (MWAA). Sales for Astronomer.io. Open-source edition deployable in minutes.

Pricing

Airbyte: Infrastructure pricing based on compute time and egress costs. Credits roll over.

Airflow: Pricing for Cloud Composer is based on CPU, storage and egress costs. Pricing for MWAA is based on storage and compute costs. Astronomer.io’s pricing is not public.

API

Airbyte: Available through Airbyte Cloud and Airbyte’s open-source edition.

Airflow: Available.

Connectors

Pre-built connectors are the primary way ETL / ELT solutions differentiate themselves, as they let data teams focus on building insights rather than pipelines.

Airbyte

Within 14 months from inception, Airbyte already offers connectors for more than 120 data sources, and all major data warehouses, lakes and databases as data destinations. 

All Airbyte connectors are open source and can be edited to address any custom needs customers have. Airbyte users can leverage these connectors through the open-source edition or the Cloud offering.

Airbyte’s Connector Development Kit also enables users to build custom connectors in a standardized way within 2 hours (instead of 2 days), and the Airbyte team and community can help maintain them.

About 30% of the connectors have been contributed by the growing community. Airbyte will provide an SLA for certified connectors, and its ambition is to extend SLAs to community connectors through its participative model on the long tail of integrations, reaching 1,000+ connectors in the next few years.

Airbyte will offer reverse-ETL connectors in 2022. 

Airflow

You can use one of the 60 available Airflow transfer operators to move data from one system to another, like the PostgresToGCSOperator. Sources and destinations are tightly coupled, so you need a different transfer operator for each source-destination pair. This makes it hard for Airflow to cover the long tail of integrations.

Transformation

Airbyte

Airbyte is an ELT tool and does not transform data prior to loading. Airbyte offers two normalization options out of the box: a serialized JSON file, or basic normalization that recreates the original structure of the data at the destination.
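To make the two options concrete, here is an illustrative, pure-Python sketch (not Airbyte's actual code) of the difference between landing a raw serialized JSON record and applying a basic normalization that flattens it into destination-style columns. The record and its fields are invented for the example.

```python
import json

# A raw record as an EL tool might land it: one serialized JSON blob per row.
raw_row = json.dumps({"id": 7, "user": {"name": "Ada", "plan": "pro"}})

def basic_normalize(serialized):
    """Flatten one level of nesting into destination-style columns."""
    record = json.loads(serialized)
    flat = {}
    for key, value in record.items():
        if isinstance(value, dict):
            for subkey, subvalue in value.items():
                flat[f"{key}_{subkey}"] = subvalue
        else:
            flat[key] = value
    return flat

print(basic_normalize(raw_row))
# → {'id': 7, 'user_name': 'Ada', 'user_plan': 'pro'}
```

The first option keeps `raw_row` as-is in a single JSON column; the second produces one column per flattened field.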

Airbyte also offers custom transformations via SQL and through deep integration with dbt, allowing their users and customers to trigger their own dbt packages at the destination level right after the EL. 

Airflow

You can transform data locally with the PythonOperator, remotely with operators like the SparkSubmitOperator and in the database with operators like the BigQueryInsertJobOperator. You can also integrate Airflow with dbt for transformations.
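As a sketch of the "transform locally" path, here is the kind of plain-Python callable you might hand to a `PythonOperator`. The function and data are invented for the example; the Airflow wiring is shown only as a comment, since running it requires an Airflow installation.

```python
def clean_prices(rows):
    """Drop malformed rows and convert price strings to floats."""
    cleaned = []
    for row in rows:
        try:
            cleaned.append({"sku": row["sku"], "price": float(row["price"])})
        except (KeyError, ValueError):
            continue  # skip rows the upstream extract could not parse
    return cleaned

# In an Airflow DAG you would wire this up roughly as:
#   PythonOperator(task_id="clean_prices", python_callable=clean_prices, ...)

result = clean_prices([
    {"sku": "A1", "price": "19.90"},
    {"sku": "B2", "price": "n/a"},   # malformed, dropped
])
print(result)  # → [{'sku': 'A1', 'price': 19.9}]
```

Because the callable is ordinary Python, it can be unit-tested outside Airflow and only scheduled by it.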

Customizability

Every company has a custom data architecture and, therefore, unique data integration needs. Many tools don't let teams address them, which results in heavy investment in building and maintaining additional in-house scripts.

Airbyte

Airbyte’s modular architecture means you can leverage any part of Airbyte independently. For instance, you can use Airflow’s orchestrator to trigger Airbyte’s ELT jobs.

You can also edit any pre-built connector to fit your specific needs, or leverage the Connector Development Kit to build your own custom connectors in a matter of hours (instead of days) and share their maintenance with the community and the Airbyte team.

Airbyte’s promise is to address all your ELT needs and the long tail of integrations. 

Airflow

Airflow operators are split into the built-in operators and provider packages. You can modify existing operators and also create new operators on top of existing Airflow hooks.

You can scale Airflow deployments with operators and executors. For example, you can use the CeleryExecutor or the KubernetesExecutor to scale your Airflow workers. 

You can also use Airflow to schedule ELT tasks and integrate it with Airbyte for the EL steps and dbt for the T step.

Support & docs

Data integration tools can be complex, so customers need to have great support channels. This includes online documentation as well as tutorials, email and chat support. More complicated tools may also offer training services.

Airbyte

Airbyte provides chat support directly on their web app, with an average time to respond of 1 hour. 

Their documentation is comprehensive and full of tutorials. 

Airbyte also has a Slack and Discourse community where help is available from the Airbyte team, other users or contributors.

Airbyte does not provide any training services. 

Airflow

Astronomer.io is the only service to provide premium support.

Airflow documentation is comprehensive but spread across several sites. Astronomer.io also provides high-quality documentation and guides.

There is a popular Airflow Slack community.

You can get Airflow training from Astronomer.io and get the Apache Airflow certification.

Pricing

Airbyte

Airbyte provides a one-month free trial or $400 worth of credits, whichever expires first. Airbyte’s pricing is credit-based, and you consume credits based on compute time and, to a lesser extent, egress costs. Airbyte positions itself as a self-service infrastructure company. 

This pricing structure adapts well to all use cases, including database replication. 

Airbyte doesn’t charge for failed syncs or normalization. 

Airbyte offers adapted pricing to customers with large volumes. 

Airflow

Cloud Composer pricing is consumption based, so you pay for what you use, based on your CPU, storage and data transfer costs.

Amazon Managed Workflows for Apache Airflow pricing is based on CPU usage from the scheduler, workers and web server. You also pay for metadata database storage.

Astronomer.io pricing is not publicly available, but they provide standard, premium and custom plans. 

Don't trust our word, trust theirs!

“I was blown away by how easy it was to get started with Airbyte. My developers are not proficient about data systems but were able to adapt a Singer tap for a missing integration in 25 mins. Because Airbyte is so simple to use, I was able to avoid a planned data engineer hire.​”
Jins Kadwood
CTO at AgriDigital
“Using Airbyte makes extracting data from various sources super easy! I don't have to spend a lot of time maintaining difficult data pipelines. Instead, I can now use that time to generate meaningful insights from data. Also, with the great support of the Airbyte developers, I was able to create a new source connector in a couple of hours.​”
Thomas Van Latum
Data Engineer & Cloud Architect at g-company
"We built a Smartsheets python source connector using Airbyte's CDK. We were able to implement and deploy this connector very quickly thanks to help from Airbyte's team and the CDK's ease of use - the first of many!"
Nathan Nowack
Data Engineer at Slate
"I used the Airbyte CDK to write 2 connectors. The experience was amazing, the setup was pretty straightforward, with just a few simple additional steps and in almost no time I was able to develop a new connector and get it running."
Murilo Nigris
Head of Data Analytics at AMARO

Getting started is easy

Start breaking down your data silos with Airbyte.