When you enroll through our links, we may earn a small commission—at no extra cost to you. This helps keep our platform free and inspires us to add more value.


Hands-on Kafka Connect: Source to Sink in S3, GCS & Beyond

Master Kafka Connect with hands-on experience: S3 Sink and Debezium MySQL CDC Source connectors, and Connect cluster setup

     
  • 4.2 (8 reviews)
  • ₹519

This Course Includes

  • Udemy
  • 4.2 (8 reviews)
  • 2h 51m
  • English
  • Online - Self Paced
  • Professional certificate

About Hands-on Kafka Connect: Source to Sink in S3, GCS & Beyond

This course is completely dedicated to Kafka Connect and its open-source connectors. Plenty of connectors are available for Kafka Connect; to begin with, this course covers one sink connector and one source connector. We start by learning what Kafka Connect is and how its architecture works.

In _Module 2_, we learn the _S3 Sink connector_ in detail. First, we learn what the S3 sink connector is and install it in standalone mode. Next, we run the same configuration in distributed mode so that the difference between the two modes is clear.
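To make that contrast concrete, here is a minimal sketch of a standalone-mode S3 sink configuration; the connector name, topic, bucket, and region are illustrative placeholders, and it assumes the Confluent S3 sink connector plugin is installed:

```properties
# Hypothetical standalone config (s3-sink.properties); all names are placeholders
name=s3-sink-demo
connector.class=io.confluent.connect.s3.S3SinkConnector
tasks.max=1
topics=orders
s3.bucket.name=my-demo-bucket
s3.region=us-east-1
storage.class=io.confluent.connect.s3.storage.S3Storage
format.class=io.confluent.connect.s3.format.json.JsonFormat
# Small flush size so demo records show up in S3 quickly
flush.size=3
```

In distributed mode, the same keys are wrapped in JSON and submitted to the Connect REST API rather than passed as a properties file on the command line.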

We then explore the _Partitioner Class_ with three examples (one is sketched just after this list):

  1. Default Partitioner
  2. Time Based Partitioner
  3. Field Partitioner
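As a taste of the second of those, switching the sink to a time-based directory layout is mostly a matter of partitioner settings; a hedged sketch using the Confluent storage partitioner:

```properties
# Hypothetical time-based partitioning settings for the S3 sink
partitioner.class=io.confluent.connect.storage.partitioner.TimeBasedPartitioner
# One S3 prefix per hour of data
partition.duration.ms=3600000
path.format='year'=YYYY/'month'=MM/'day'=dd/'hour'=HH
locale=en-US
timezone=UTC
# Partition by the record's own timestamp rather than wall-clock time
timestamp.extractor=Record
```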

After that, we learn how to integrate Kafka Connect with _Schema Registry_ and test schema evolution in _BACKWARD_ compatibility mode.
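In practice that integration comes down to converter settings on the worker or connector; a minimal sketch assuming a Schema Registry listening at localhost:8081:

```properties
# Hypothetical converter settings pointing Connect at Schema Registry
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081
```

Under BACKWARD compatibility, a reader using the new schema must still be able to read data written with the old one, so changes such as removing a field or adding a field with a default value pass the check.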

Next, we learn what a _DLQ_ (dead letter queue) is and test it by producing invalid records to Kafka.
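Dead letter queue routing for a sink connector is controlled by the errors.* settings; a minimal sketch (the topic name is a placeholder):

```properties
# Hypothetical DLQ settings for the S3 sink connector
errors.tolerance=all
errors.deadletterqueue.topic.name=dlq-s3-sink
# Single-broker demo value; raise for production clusters
errors.deadletterqueue.topic.replication.factor=1
# Put the failure reason into Kafka record headers for debugging
errors.deadletterqueue.context.headers.enable=true
```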

Lastly, we automate creating the S3 sink connector with a single command using Docker Compose.

_Module 3_ is dedicated to setting up a _Kafka Connect cluster_. Here, we provision two machines on AWS and start an S3 sink connector worker process on each. We thoroughly test the _Load Balancing_ and _Fault Tolerance_ behaviour of our Kafka Connect cluster.
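Workers form a single cluster simply by sharing a group.id and the three internal topics; a hedged sketch of the connect-distributed.properties entries both machines would share (the broker address is a placeholder):

```properties
# Hypothetical worker config, identical on both AWS machines
bootstrap.servers=broker1:9092
# Same group.id on every worker = one Connect cluster
group.id=connect-cluster-demo
offset.storage.topic=connect-offsets
config.storage.topic=connect-configs
status.storage.topic=connect-status
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
```

Stopping the worker process on one machine should trigger a rebalance that moves its tasks to the surviving worker, which is the fault-tolerance behaviour the module verifies.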

In _Module 4_, we explore a popular source connector: the _Debezium MySQL CDC Source connector_. First, we learn how the Debezium CDC connector works internally. Then we start our Debezium MySQL connector in distributed mode using Docker commands.
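For orientation, here is a rough sketch of such a configuration using Debezium 2.x property names; the host, credentials, and table list are illustrative placeholders:

```properties
# Hypothetical Debezium MySQL CDC source connector config (Debezium 2.x naming)
name=mysql-cdc-demo
connector.class=io.debezium.connector.mysql.MySqlConnector
database.hostname=mysql
database.port=3306
database.user=debezium
database.password=dbz-password
# Must be unique among all clients of this MySQL server's binlog
database.server.id=184054
# Prefix for change-event topics, e.g. shop.inventory.customers
topic.prefix=shop
table.include.list=inventory.customers
# Kafka topic where Debezium records DDL history for recovery
schema.history.internal.kafka.bootstrap.servers=broker1:9092
schema.history.internal.kafka.topic=schema-history.shop
```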

After that, we run _DML_ statements such as insert, update and delete queries and study the corresponding change-event schema for each operation.
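For reference, Debezium change events use a before/after envelope; a simplified sketch (fields trimmed) of what updating a customer's email might emit:

```json
{
  "before": { "id": 1004, "email": "old@example.com" },
  "after":  { "id": 1004, "email": "new@example.com" },
  "source": { "connector": "mysql", "db": "inventory", "table": "customers" },
  "op": "u",
  "ts_ms": 1700000000000
}
```

The op field is "c" for inserts, "u" for updates, and "d" for deletes; on a delete, after is null and before holds the last known row state.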

Similarly, we run _DDL_ statements, such as dropping a table, and observe how the schema-history Kafka topic captures those changes.

Lastly, we integrate it with _Schema Registry_ and test the setup by running DDL & DML statements.

What You Will Learn

  • In-depth knowledge of Kafka Connect and its architecture.
  • In-depth practical knowledge of running the S3 sink connector in distributed mode.
  • Setting up a Kafka Connect cluster.
  • Complete understanding of the Debezium MySQL CDC Source connector.
  • Why Schema Registry is necessary and how to integrate it with sink and source connectors.
  • Schema evolution in sink and source connectors.