AutoMQ x RisingWave: Build Event driven Data Stack with Kafka Ecosystem

lyx edited this page Jan 17, 2025 · 1 revision

RisingWave is a distributed streaming database that provides a standard SQL interface, fully compatible with the PostgreSQL ecosystem, and integrates seamlessly without requiring any code changes. RisingWave treats streams as tables, enabling users to perform complex queries on both streaming and historical data smoothly. With RisingWave, users can concentrate on query analysis logic without having to learn Java or the specific underlying APIs of different systems.

This article will detail the process of importing data from AutoMQ into the RisingWave database using RisingWave Cloud.

Prepare AutoMQ and test data

Follow the Stand-alone Deployment guide to deploy AutoMQ, ensuring network connectivity between AutoMQ and RisingWave.

Quickly create a topic named example_topic in AutoMQ and write a test JSON message by following these steps.

Create Topic

Use the Apache Kafka command-line tools to create the topic. Make sure you have access to a Kafka environment and that the Kafka service is running. Here is an example command to create a topic:

./kafka-topics.sh --create --topic example_topic --bootstrap-server 10.0.96.4:9092 --partitions 1 --replication-factor 1

When executing the command, replace `topic` with your topic name and `bootstrap-server` with your actual Kafka server address.

Once the topic has been created, use the following command to confirm its successful creation.

./kafka-topics.sh --describe --topic example_topic --bootstrap-server 10.0.96.4:9092

Generate test data

Generate JSON-formatted test data matching the schema that the source will use.

{
  "id": 1,
  "name": "测试用户",
  "timestamp": "2023-11-10T12:00:00",
  "status": "active"
}
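The test record above can also be generated in bulk. Below is a minimal Python sketch (the field layout matches the example; `make_records` and the field values are illustrative, not part of AutoMQ or RisingWave) that prints one JSON object per line, ready to be piped into the Kafka console producer:

```python
import json
from datetime import datetime, timedelta


def make_records(count, start=datetime(2023, 11, 10, 12, 0, 0)):
    """Build `count` test records matching the example schema."""
    records = []
    for i in range(count):
        records.append({
            "id": i + 1,
            "name": f"test user {i + 1}",
            "timestamp": (start + timedelta(minutes=i)).isoformat(),
            "status": "active",
        })
    return records


if __name__ == "__main__":
    # One JSON object per line, as expected by kafka-console-producer.sh.
    for record in make_records(3):
        print(json.dumps(record))
```

The output can then be piped into the producer, e.g. `python generate_test_data.py | sh kafka-console-producer.sh --bootstrap-server 10.0.96.4:9092 --topic example_topic` (the script name is illustrative).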

Write test data

Test data can be written to a topic named "example_topic" using Kafka's command-line tools or programmatically. Here's an example using command-line tools:

echo '{"id": 1, "name": "测试用户", "timestamp": "2023-11-10T12:00:00", "status": "active"}' | sh kafka-console-producer.sh --broker-list 10.0.96.4:9092 --topic example_topic

To view the data recently written to the topic, use the following command:

sh kafka-console-consumer.sh --bootstrap-server 10.0.96.4:9092 --topic example_topic --from-beginning

When executing the commands, replace `topic` with your topic name and `bootstrap-server` with your actual Kafka server address.

Create an AutoMQ source on RisingWave Cloud

  1. Navigate to Clusters on RisingWave Cloud to create a cluster.

  2. Go to Source on RisingWave Cloud to create a source.

  3. Specify the cluster and database, and log into the database.

  4. AutoMQ is fully compatible with Apache Kafka®, so just click on Create source and choose Kafka.

  5. Follow the guide on RisingWave Cloud to configure the connector, set up source information, and define the schema.

  6. Review the generated SQL statement, then click Confirm to finalize the source creation.

AutoMQ defaults to port 9092 and does not have SSL enabled. To enable SSL, please refer to the Apache Kafka Documentation.

In this example, you can access all the data in the topic from the beginning by setting the startup mode to "earliest" and using the JSON format.
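For reference, the SQL that RisingWave Cloud generates for such a source looks roughly like the following sketch. The source name, column types, topic, and broker address are taken from the example above and are assumptions to adapt, not the exact generated statement:

```sql
CREATE SOURCE IF NOT EXISTS example_topic_source (
    id INT,
    name VARCHAR,
    "timestamp" TIMESTAMP,
    status VARCHAR
)
WITH (
    connector = 'kafka',
    topic = 'example_topic',
    properties.bootstrap.server = '10.0.96.4:9092',
    scan.startup.mode = 'earliest'
) FORMAT PLAIN ENCODE JSON;
```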

Query data

  1. Go to RisingWave Cloud Console and log into the cluster.

  2. Run the following SQL statement to access the imported data, replacing the variable `your_source_name` with the custom name specified when creating the source.

SELECT * FROM {your_source_name} LIMIT 1;
