AutoMQ x RisingWave: Build an Event-Driven Data Stack with the Kafka Ecosystem
RisingWave is a distributed streaming database that provides a standard SQL interface, fully compatible with the PostgreSQL ecosystem, and integrates seamlessly without requiring any code changes. RisingWave treats streams as tables, enabling users to perform complex queries on both streaming and historical data smoothly. With RisingWave, users can concentrate on query analysis logic without having to learn Java or the specific underlying APIs of different systems.
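Because RisingWave speaks the PostgreSQL wire protocol, you can connect with standard tools such as psql and query streams with ordinary SQL. The snippet below is a minimal sketch of this streams-as-tables model; the host, credentials, and the `user_events` stream are illustrative, not part of this tutorial:

```sql
-- Connect with any PostgreSQL client, e.g. (defaults may vary by deployment):
--   psql -h <risingwave-host> -p 4566 -d dev -U root
-- A stream registered in RisingWave is queried like a regular table:
SELECT name, status
FROM user_events            -- illustrative stream name
WHERE status = 'active';
```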
This article will detail the process of importing data from AutoMQ into the RisingWave database using RisingWave Cloud.
Follow the Stand-alone Deployment guide to deploy AutoMQ, ensuring network connectivity between AutoMQ and RisingWave.
Quickly create a topic named `example_topic` in AutoMQ and write a test JSON message by following the steps below.
Use the Apache Kafka command-line tools to create the topic, making sure you have access to a Kafka environment and that the Kafka service is running. For example:
./kafka-topics.sh --create --topic example_topic --bootstrap-server 10.0.96.4:9092 --partitions 1 --replication-factor 1
When executing the command, replace `topic` and `bootstrap-server` with your actual topic name and Kafka server address.
Once the topic has been created, use the following command to confirm its successful creation.
./kafka-topics.sh --describe --topic example_topic --bootstrap-server 10.0.96.4:9092
Generate JSON-formatted test data that matches the schema to be defined for the source:
{
"id": 1,
"name": "测试用户",
"timestamp": "2023-11-10T12:00:00",
"status": "active"
}
Test data can be written to a topic named "example_topic" using Kafka's command-line tools or programmatically. Here's an example using command-line tools:
echo '{"id": 1, "name": "测试用户", "timestamp": "2023-11-10T12:00:00", "status": "active"}' | sh kafka-console-producer.sh --broker-list 10.0.96.4:9092 --topic example_topic
To view the data recently written to the topic, use the following command:
./kafka-console-consumer.sh --bootstrap-server 10.0.96.4:9092 --topic example_topic --from-beginning
- Navigate to Clusters on RisingWave Cloud to create a cluster.
- Go to Source on RisingWave Cloud to create a source.
- Specify the cluster and database, and log in to the database.
- AutoMQ is fully compatible with Apache Kafka®, so simply click Create source and choose Kafka.
- Follow the guide on RisingWave Cloud to configure the connector, set up the source information, and define the schema.
- Review the generated SQL statement and click Confirm to finalize the source creation.
AutoMQ defaults to port 9092 and does not have SSL enabled. To enable SSL, please refer to the Apache Kafka Documentation.
In this example, setting the startup mode to "earliest" and the format to JSON lets you read all data in the topic from the beginning.
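For reference, the statement generated by RisingWave Cloud should look roughly like the sketch below. The source name `example_source` and the column definitions are assumptions derived from the test message; the exact syntax may vary with the RisingWave version:

```sql
CREATE SOURCE IF NOT EXISTS example_source (
  id INTEGER,
  name VARCHAR,
  "timestamp" TIMESTAMP,
  status VARCHAR
) WITH (
  connector = 'kafka',
  topic = 'example_topic',
  properties.bootstrap.server = '10.0.96.4:9092',
  scan.startup.mode = 'earliest'
) FORMAT PLAIN ENCODE JSON;
```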
- Go to the RisingWave Cloud Console and log in to the cluster.
- Run the following SQL statement to access the imported data, replacing `your_source_name` with the custom name specified when creating the source.
SELECT * FROM {your_source_name} LIMIT 1;
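If the source was created successfully, the query returns the test record written earlier. From here you can let RisingWave maintain incremental results over the stream; the sketch below assumes the source is named `example_source` with the columns from the test message:

```sql
-- A materialized view that RisingWave keeps incrementally updated
-- as new events arrive on the AutoMQ topic.
CREATE MATERIALIZED VIEW status_counts AS
SELECT status, COUNT(*) AS cnt
FROM example_source
GROUP BY status;

-- Query it like any table; results reflect all events ingested so far.
SELECT * FROM status_counts;
```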
- What is AutoMQ: Overview
- Difference with Apache Kafka
- Difference with WarpStream
- Difference with Tiered Storage
- Compatibility with Apache Kafka
- Licensing
- Deploy Locally
- Cluster Deployment on Linux
- Cluster Deployment on Kubernetes
- Example: Produce & Consume Message
- Example: Simple Benchmark
- Example: Partition Reassignment in Seconds
- Example: Self Balancing when Cluster Nodes Change
- Example: Continuous Data Self Balancing
- S3Stream shared streaming storage
- Technical advantage
- Deployment: Overview
- Runs on Cloud
- Runs on CEPH
- Runs on CubeFS
- Runs on MinIO
- Runs on HDFS
- Configuration
- Data analysis
- Object storage
- Kafka UI
- Observability
- Data integration