DevOps Articles

Curated articles, resources, tips and trends from the DevOps World.

Data Pipeline Using MongoDB and Kafka Connect on Kubernetes

4 years ago dzone.com

Summary: This is a summary of an article originally published on dzone.com.

In *Kafka Connect on Kubernetes, the easy way!*, I demonstrated Kafka Connect on Kubernetes using Strimzi with the file source and sink connectors. This blog will show how to build a simple data pipeline with MongoDB and Kafka using the MongoDB Kafka connectors, deployed on Kubernetes with Strimzi.
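As a rough illustration of the Strimzi-based deployment described above, a `KafkaConnect` custom resource might look like the following. This is a minimal sketch, not the article's actual manifest; the cluster name, Kafka version, and bootstrap address are illustrative assumptions.

```yaml
# Hypothetical Strimzi KafkaConnect resource; names and versions are illustrative.
apiVersion: kafka.strimzi.io/v1beta2
kind: KafkaConnect
metadata:
  name: my-connect-cluster            # assumed name
  annotations:
    # Let Strimzi manage connectors declaratively via KafkaConnector resources
    strimzi.io/use-connector-resources: "true"
spec:
  version: 3.7.0                      # assumed Kafka version
  replicas: 1
  bootstrapServers: my-kafka:9092     # replace with your cluster's bootstrap address
  config:
    group.id: connect-cluster
    offset.storage.topic: connect-offsets
    config.storage.topic: connect-configs
    status.storage.topic: connect-status
```

The `use-connector-resources` annotation is what lets connectors themselves be defined as Kubernetes resources rather than through the Connect REST API.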

Azure Kubernetes Service (AKS) reduces the complexity and operational overhead of managing Kubernetes by offloading much of that responsibility to Azure.

Kafka Connect will need to reference an existing Kafka cluster (which in this case is Azure Event Hubs).
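Pointing the Connect cluster at Event Hubs typically means using its Kafka-compatible endpoint on port 9093 with SASL PLAIN over TLS, where the literal username is `$ConnectionString` and the password is the namespace connection string. A hedged sketch of the relevant `KafkaConnect` spec fields (the namespace placeholder and Secret name are assumptions):

```yaml
# Illustrative fragment: authenticating Kafka Connect against Azure Event Hubs.
spec:
  bootstrapServers: <EVENTHUBS_NAMESPACE>.servicebus.windows.net:9093
  authentication:
    type: plain
    username: $ConnectionString        # literal username Event Hubs expects
    passwordSecret:
      secretName: eventhubs-secret     # assumed Kubernetes Secret name
      password: connection-string      # key holding the Event Hubs connection string
  tls:
    trustedCertificates: []            # empty list: trust the default system CAs
```

Storing the connection string in a Secret keeps the credential out of the manifest itself.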

If you initially created items in the source Azure Cosmos DB collection, they should have been copied to the Kafka topic by the source connector and subsequently persisted to the sink Azure Cosmos DB collection by the sink connector. To confirm this, query Azure Cosmos DB using any of the methods mentioned previously.
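The sink half of that pipeline could be declared with a `KafkaConnector` resource like the one below. This is a sketch under assumptions: the topic, database, collection, and Secret mount path are hypothetical, and only the connector class name comes from the MongoDB Kafka connector itself.

```yaml
# Hypothetical KafkaConnector resource for the MongoDB sink connector;
# topic, database, collection, and connection URI location are assumptions.
apiVersion: kafka.strimzi.io/v1beta2
kind: KafkaConnector
metadata:
  name: mongo-sink
  labels:
    strimzi.io/cluster: my-connect-cluster   # must match the KafkaConnect name
spec:
  class: com.mongodb.kafka.connect.MongoSinkConnector
  tasksMax: 1
  config:
    topics: my-topic                         # assumed topic name
    connection.uri: "${file:/opt/kafka/external-configuration/mongo/connection.uri}"
    database: my-db                          # assumed sink database
    collection: sink-collection              # assumed sink collection
```

A matching source connector would use `MongoSourceConnector` against the source collection, producing the records that this sink then writes out.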
