Enterprise use cases often involve integrating data from a variety of sources and running analytics over logs and activity streams. For instance, if you are building an e-commerce application backed by a relational database such as MySQL, you might need to combine incoming order data with other downstream data for operational analytics. Relational databases maintain a transaction log that records every event in the database: every update, insert, and delete is written to it. Rather than moving all the order data in bulk, change data capture (CDC) approaches let you stream every single event from the database, as it occurs, into a streaming platform like Apache Kafka.
By consuming the transaction log from MySQL, data integration tools such as Airbyte can extract data changes at very low latency using CDC and deliver them to Kafka, which serves as a central integration point for numerous downstream data feeds. This recipe explains how to sync data from a MySQL database to Kafka using CDC. The process is similar for all Airbyte database sources that support CDC, such as Postgres and MSSQL.
Change Data Capture (CDC) is an efficient replication technology that allows row-level data changes at the source database to be quickly identified, captured, and delivered in real-time to the destination database store. With CDC in use, only the data that has changed — categorized by insert, update, and delete operations — since the last replication is transferred.
Here are the tools you’ll need to get started on replicating data from a MySQL database to Kafka.
In this recipe, we will create a MySQL database in our hosted instance, and load it with some sample data.
To set up the sample data, first download the repo at the link here. Once downloaded, run the following mysql commands to create the ‘employees’ database and its associated tables, and to insert the sample data into those tables.
You may be prompted to enter your MySQL password. Replace the text between the angle brackets with the hostname, port, and username of your MySQL instance.
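As an illustration, loading the sample data looks like the following. The script names here are hypothetical placeholders; use whichever `.sql` files the downloaded repo actually provides, and substitute your own connection details for the angle-bracket placeholders.

```bash
# Create the 'employees' database and its tables (script name is an example).
mysql -h <hostname> -P <port> -u <username> -p < create_employees.sql

# Load the sample rows into the tables (script name is an example).
mysql -h <hostname> -P <port> -u <username> -p < load_employees.sql
```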
Once the two scripts are run, the employees database will be created. You can view the created tables by logging into the MySQL CLI and running the following commands.
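A quick sanity check might look like this (a sketch, assuming the database was created under the name `employees`):

```bash
# List the tables created in the 'employees' database.
mysql -h <hostname> -P <port> -u <username> -p -e 'USE employees; SHOW TABLES;'
```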
It's recommended to create a dedicated user for better permission control and auditing. Alternatively, you can use Airbyte with an existing user in your database.
To create a dedicated database user, run the following commands against your database.
The required permissions differ between the STANDARD and CDC replication methods. For the STANDARD replication method, only the SELECT permission is required. For the CDC replication method, the SELECT, RELOAD, SHOW DATABASES, REPLICATION SLAVE, and REPLICATION CLIENT permissions are required.
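A sketch of creating such a user, assuming the permissions listed above (the user name `airbyte` and the password are example values; run this as an administrative user):

```bash
mysql -h <hostname> -P <port> -u <admin_username> -p <<'SQL'
-- Create a dedicated user for Airbyte (name and password are examples).
CREATE USER 'airbyte'@'%' IDENTIFIED BY 'your_password_here';

-- The STANDARD replication method only needs SELECT on the replicated schema:
GRANT SELECT ON employees.* TO 'airbyte'@'%';

-- The CDC replication method additionally needs these global privileges:
GRANT RELOAD, SHOW DATABASES, REPLICATION SLAVE, REPLICATION CLIENT
  ON *.* TO 'airbyte'@'%';
SQL
```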
Your database user should now be ready for use with Airbyte.
It's easy to create a MySQL source through the Airbyte UI. Make sure to select CDC as the replication method. We have not used SSH in our example, but if you are connecting over the public internet in production, we recommend using an SSH tunnel.
Next, we will set up a Kafka destination in Airbyte. In this recipe, we are running Kafka in our hosted instance and will connect to it using a Kafka client set up locally. To get the Apache Kafka client running locally, follow the Kafka quickstart steps.
First, we will create a Kafka topic named 'departments' which will be used to write the CDC data.
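Creating the topic might look like this (a sketch, run from the Kafka installation directory; the bootstrap server address depends on your Kafka setup):

```bash
# Create the 'departments' topic that the CDC events will be written to.
bin/kafka-topics.sh --create --topic departments \
  --bootstrap-server <hostname>:9092
```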
Next, create a destination in Airbyte as follows.
For all of the remaining settings, we kept the default values that were provided.
Once the source and destination are set up, you can create a connection from MySQL to Kafka in Airbyte. Then, in the “Select the data you want to sync” section, choose the departments table and select Incremental under Sync mode.
Airbyte’s sync mode and sync frequency options give you control over how data is replicated incrementally and on what schedule.
Once configured, you can see your connection on the Connection tab.
Now that your connection is set up, go back into your MySQL shell and run the following commands to add, update and then delete a row in the departments table.
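For example (a sketch, assuming the standard employees sample schema where departments has `dept_no` and `dept_name` columns; `'d999'` and the department names are example values):

```bash
mysql -h <hostname> -P <port> -u <username> -p <<'SQL'
USE employees;
-- Add a new department row.
INSERT INTO departments VALUES ('d999', 'Data Engineering');
-- Update the row we just added.
UPDATE departments SET dept_name = 'Data Platform' WHERE dept_no = 'd999';
-- Delete the row again.
DELETE FROM departments WHERE dept_no = 'd999';
SQL
```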
Next, go back to the Airbyte UI and select the connection you just created, and trigger a manual sync.
Once the sync is complete, in a new terminal window, run the following command to read the events that were persisted to the Kafka topic.
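Reading the events back might look like this (a sketch, run from the Kafka installation directory; we assume here that the Airbyte destination writes to the `departments` topic, which depends on the topic pattern you configured):

```bash
# Consume all events on the 'departments' topic from the beginning.
bin/kafka-console-consumer.sh --topic departments --from-beginning \
  --bootstrap-server <hostname>:9092
```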
The screenshot below shows the CDC data for the row you just inserted, updated, and deleted with corresponding timestamps.
Here’s what we’ve accomplished during this recipe:

- Set up a MySQL database and loaded it with sample data
- Created a dedicated database user with the permissions CDC requires
- Configured an Airbyte MySQL source with CDC replication and a Kafka destination
- Created a connection and replicated row-level inserts, updates, and deletes from MySQL into a Kafka topic
With a combination of MySQL CDC logs, Airbyte and Kafka, distributed data platforms can be kept in sync and made aware of data changes.
We know that engineering teams working on fast-moving projects need quick answers to their questions from developers who are actively developing Airbyte. Join the conversation at Airbyte’s community Slack Channel to share your ideas with over 1000 data engineers and help make everyone’s project successful.
Start breaking your data silos with Airbyte.
Learn how to replicate data from an OnLine Transactional Processing (OLTP) database like PostgreSQL, to an OnLine Analytical Processing (OLAP) data warehouse like Snowflake.
Learn how to modify the dbt code used by Airbyte to partition and cluster BigQuery tables.