Using Airbyte to migrate and keep in sync from any database (MySQL, PostgreSQL, GCP BigQuery, AWS Redshift, DB2, MS SQL Server, Snowflake, or MongoDB) to StarRocks #26676
Airbyte has a large catalog of connectors. You can pick the source connector for whichever database you are migrating from (MySQL, PostgreSQL, GCP BigQuery, AWS Redshift, DB2, MS SQL Server, Snowflake, or MongoDB) and pair it with the StarRocks Airbyte destination connector to load that data into StarRocks. Airbyte will also keep the data in sync through a periodic sync job that you can configure. This covers the EL part of ELT; at the end you still need to do your own T ("transform") to restructure the data into the shape you need.
You can use this tutorial as the basis for the migration: #23713
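Once the source and destination connectors are configured and a connection exists between them, syncs run on the schedule you set, but you can also trigger one on demand. Below is a minimal Python sketch that kicks off a manual sync through the Airbyte Config API; it assumes a local Airbyte OSS deployment reachable at http://localhost:8000 with the default basic-auth credentials, and `CONNECTION_ID` is a placeholder for the UUID of the connection you created in the Airbyte UI.

```python
# Minimal sketch: trigger a manual sync of an existing Airbyte connection.
# Assumptions: local Airbyte OSS at http://localhost:8000, default
# basic-auth credentials, and CONNECTION_ID replaced with your own UUID.
import requests

AIRBYTE_API = "http://localhost:8000/api/v1"
CONNECTION_ID = "00000000-0000-0000-0000-000000000000"  # placeholder UUID

resp = requests.post(
    f"{AIRBYTE_API}/connections/sync",
    json={"connectionId": CONNECTION_ID},
    auth=("airbyte", "password"),  # assumption: default local credentials
    timeout=30,
)
resp.raise_for_status()
job = resp.json()["job"]
print(f"Started sync job {job['id']} (status: {job['status']})")
```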
Alternative
This tutorial shows how to quickly build a streaming ELT job from MySQL to StarRocks using Flink CDC 3.0, including syncing all tables of a database, schema change evolution, and syncing sharded tables into one table.
https://ververica.github.io/flink-cdc-connectors/master/content/quickstart/mysql-starrocks-pipeline-tutorial.html
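For reference, a Flink CDC 3.0 pipeline is defined in a single YAML file and submitted to a running Flink cluster. The sketch below follows the format used in the linked tutorial; the hostnames, ports, credentials, and the `app_db` table pattern are placeholders you would replace with your own.

```yaml
# Sketch of a Flink CDC 3.0 pipeline definition (mysql-to-starrocks.yaml).
# Hostnames, credentials, and the app_db table pattern are placeholders.
source:
  type: mysql
  hostname: localhost
  port: 3306
  username: root
  password: "123456"
  tables: app_db.\.*          # sync every table in the app_db database
  server-id: 5400-5404
  server-time-zone: UTC

sink:
  type: starrocks
  name: StarRocks Sink
  jdbc-url: jdbc:mysql://127.0.0.1:9030   # StarRocks FE query port
  load-url: 127.0.0.1:8030                # StarRocks FE HTTP port (Stream Load)
  username: root
  password: ""
  table.create.properties.replication_num: 1  # single-node test setting

pipeline:
  name: Sync MySQL Database to StarRocks
  parallelism: 2
```

You would then submit it with something like `bash bin/flink-cdc.sh mysql-to-starrocks.yaml` from the Flink CDC distribution, as the tutorial describes.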