To keep the journey enjoyable, I decided to rewrite the realtime data pipeline I built two years ago in Go, this time in Rust.

A recap of the tech stack:

  1. Docker as the local runtime for development.
  2. gRPC as the RPC (Remote Procedure Call) protocol.
  3. Kafka (we are using Confluent's Kafka PaaS). — I will review this in Part 2.
  4. InfluxDB as the time-series database.
  5. Grafana as the UI dashboard.
  6. Rust as the programming language.
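To make the flow between these components concrete, here is a minimal sketch of the pipeline shape using only standard-library channels. The Kafka consumer and InfluxDB writer are stubbed out (the `consume` and `sink` functions and the `Point` struct are illustrative, not the real clients):

```rust
use std::sync::mpsc;
use std::thread;

// A single measurement flowing through the pipeline.
#[derive(Debug, Clone, PartialEq)]
struct Point {
    metric: String,
    value: f64,
    timestamp_ms: u64,
}

// Stand-in for the Kafka side: in the real pipeline this would be
// a Kafka client polling a topic and forwarding records downstream.
fn consume(tx: mpsc::Sender<Point>) {
    for i in 0..3u64 {
        tx.send(Point {
            metric: "cpu".to_string(),
            value: i as f64,
            timestamp_ms: 1_000 * i,
        })
        .expect("receiver dropped");
    }
}

// Stand-in for the InfluxDB writer: collects the points it would persist.
fn sink(rx: mpsc::Receiver<Point>) -> Vec<Point> {
    rx.into_iter().collect()
}

fn main() {
    let (tx, rx) = mpsc::channel();
    let producer = thread::spawn(move || consume(tx));
    let written = sink(rx);
    producer.join().unwrap();
    println!("wrote {} points", written.len());
}
```

In the real pipeline the channel boundary is the Kafka topic itself, which decouples the producing and consuming services the same way the channel decouples the two threads here.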

Everything starts with the design.

I will skip the architecture design descriptions, since I already covered them in the previous Golang pipeline article (see the link at the beginning of this article), and go straight to the application design.

However, this time I'm focusing more on a one-repo, multiple-builds layout, which should improve our code's maintainability and scalability in the future.

In Rust, the compiler's checks also help guide how we structure our application code.
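One example of what those compiler checks buy us: if pipeline messages are modeled as an enum, the compiler forces every `match` to handle all variants, so adding a new message type later becomes a compile error at each unhandled site rather than a silent runtime gap. (The `PipelineEvent` type below is hypothetical, just to illustrate the idea.)

```rust
// Hypothetical message type for the pipeline.
#[derive(Debug)]
enum PipelineEvent {
    Tick { value: f64 },
    Heartbeat,
    Shutdown,
}

fn describe(event: &PipelineEvent) -> String {
    // Exhaustive match: deleting or adding a variant without updating
    // this match fails to compile.
    match event {
        PipelineEvent::Tick { value } => format!("tick: {value}"),
        PipelineEvent::Heartbeat => "heartbeat".to_string(),
        PipelineEvent::Shutdown => "shutdown".to_string(),
    }
}

fn main() {
    println!("{}", describe(&PipelineEvent::Tick { value: 0.5 }));
}
```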

[Image: project folder structure]

  1. The build folder holds the local and production container build files.
  2. The proto folder contains the gRPC .proto files.
  3. src/bin holds the main application logic for each use case; these are the final binary build sources, declared in Cargo.toml as:

[Image: the [[bin]] sections in Cargo.toml]
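A minimal sketch of what those Cargo.toml `[[bin]]` entries might look like (the binary names here are hypothetical; each entry produces a separate executable from a file under src/bin):

```toml
[[bin]]
name = "producer"
path = "src/bin/producer.rs"

[[bin]]
name = "consumer"
path = "src/bin/consumer.rs"
```

This is what makes the "one repo, multiple builds" layout work: one Cargo package, one shared dependency tree, and one binary per use case.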

4. The src/common folder holds all the reusable logic and functions shared across the applications.
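The idea can be sketched as follows: logic that would live under src/common (inlined here as a module, with an illustrative `parse_sample` function that is not from the original article) is called from every binary in src/bin.

```rust
// Stands in for code under src/common, shared by all binaries.
mod common {
    /// Parse a "metric = value" sample; returns None on malformed input.
    pub fn parse_sample(line: &str) -> Option<(String, f64)> {
        let (name, raw) = line.split_once('=')?;
        let value = raw.trim().parse::<f64>().ok()?;
        Some((name.trim().to_string(), value))
    }
}

fn main() {
    // Each binary under src/bin would call into the shared module like this.
    let sample = common::parse_sample("cpu = 0.75");
    println!("{:?}", sample);
}
```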

#rustlang #rust

How to Build a Realtime Data Pipeline in Rust - Part 1