ZNK-KCD Kafka: Confluent Developer Training

| COURSE CODE | ZNK-KCD |
| DURATION | 3 Days |
| FEE (S$ before GST) | 2,500.00 |
| VENUE | 298 Tiong Bahru Road #08-05 Central Plaza Singapore 168730 |
| TRAINING HOURS | 9.00am to 5.00pm |
Course Overview
ABOUT THIS COURSE
In this three-day hands-on course, you will learn how to build an application that can publish data to and subscribe to data from an Apache Kafka® cluster.
AT COURSE COMPLETION
Upon completion of this course, participants will be able to:
- Describe the role of Kafka in the modern data distribution pipeline, explain core Kafka architectural concepts and components, and use the Kafka developer APIs.
- Work with Kafka Connect and Kafka Streams, as well as other components of the broader Confluent Platform, such as the Schema Registry, the REST Proxy, and KSQL.
Audience Profile
This course is designed for application developers, ETL (extract, transform, and load) developers, and data scientists who need to interact with Kafka clusters as a source of, or destination for, data.
Prerequisites
- Attendees should be familiar with developing in Java (preferred) or Python.
- No prior knowledge of Kafka is required.
Course Details
MODULE 1: THE MOTIVATION FOR APACHE KAFKA
- Systems Complexity
- Real-Time Processing is Becoming Prevalent
- Kafka: A Stream Data Platform
MODULE 2: KAFKA FUNDAMENTALS
- An Overview of Kafka
- Kafka Producers
- Kafka Brokers
- Kafka Consumers
- Kafka’s Use of ZooKeeper
- Kafka Efficiency
MODULE 3: KAFKA’S ARCHITECTURE
- Kafka’s Log Files
- Replicas for Reliability
- Kafka’s Write Path
- Kafka’s Read Path
- Partitions and Consumer Groups for Scalability
MODULE 4: DEVELOPING WITH KAFKA
- Programmatically Accessing Kafka
- Writing a Producer in Java
- Using the REST API to Write a Producer
- Writing a Consumer in Java
- Using the REST API to Write a Consumer
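A minimal sketch of the kind of Java producer this module covers, assuming the `kafka-clients` library is on the classpath; the broker address and topic name are placeholders, not part of the course material:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class SimpleProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder broker address; point this at your own cluster.
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        // try-with-resources flushes buffered records and closes the producer.
        try (Producer<String, String> producer = new KafkaProducer<>(props)) {
            // send() is asynchronous; records with the same key go to the same partition.
            producer.send(new ProducerRecord<>("my-topic", "key-1", "hello, kafka"));
        }
    }
}
```

Running this sketch requires a reachable Kafka broker; the consumer side follows the same pattern with `KafkaConsumer`, `subscribe()`, and a `poll()` loop.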
MODULE 5: MORE ADVANCED KAFKA DEVELOPMENT
- Enabling Exactly Once Sematics (EOS)
- Specifying Offsets
- Consumer Rebalancing
- Manually Committing Offsets
- Partitioning Data
- Message Durability
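The "Partitioning Data" topic above can be illustrated with a self-contained sketch of how a keyed record is mapped to a partition. Note that Kafka's default partitioner actually uses a murmur2 hash of the serialized key; `String.hashCode()` below is a simplified stand-in for illustration only:

```java
public class PartitionSketch {
    // Hedged sketch of key-based partition assignment (not Kafka's real
    // murmur2-based implementation).
    static int partitionFor(String key, int numPartitions) {
        // Mask off the sign bit so the modulo result is non-negative.
        return (key.hashCode() & 0x7fffffff) % numPartitions;
    }

    public static void main(String[] args) {
        // The same key always maps to the same partition, which is what
        // preserves per-key ordering in Kafka.
        System.out.println(partitionFor("user-42", 6));
        System.out.println(partitionFor("user-42", 6));
    }
}
```

Because assignment depends on the partition count, adding partitions to an existing topic changes where new records with a given key land.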
MODULE 6: SCHEMA MANAGEMENT IN KAFKA
- An Introduction to Avro
- Avro Schemas
- Using the Schema Registry
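For orientation, a minimal Avro record schema of the kind that would be registered with the Schema Registry might look like the following; the record and field names are illustrative, not taken from the course:

```json
{
  "type": "record",
  "name": "PageView",
  "namespace": "example.avro",
  "fields": [
    {"name": "user_id", "type": "string"},
    {"name": "url", "type": "string"},
    {"name": "timestamp", "type": "long"}
  ]
}
```

The Schema Registry stores such schemas per topic and checks that new versions remain compatible with what consumers already expect.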
MODULE 7: KAFKA CONNECT FOR DATA MOVEMENT
- The Motivation for Kafka Connect
- Kafka Connect Basics
- Modes of Working: Standalone and Distributed
- Configuring Distributed Mode
- Tracking Offsets
- Connector Configuration
- Comparing Kafka Connect with Other Options
MODULE 8: BASIC KAFKA INSTALLATION AND ADMINISTRATION
- Administering Kafka
- Log Management
- Determining How Many Partitions to Specify
- Kafka Security
MODULE 9: KAFKA STREAM PROCESSING
- The Motivation for Kafka Streams
- Kafka Streams Fundamentals
- Investigating a Kafka Streams Application
- KSQL for Apache Kafka
- Writing KSQL Queries
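As a flavour of the "Writing KSQL Queries" topic, a stream can be declared over an existing Kafka topic and then queried with SQL-like syntax; the stream name, columns, and topic below are illustrative placeholders:

```sql
-- Register an existing Kafka topic as a KSQL stream
CREATE STREAM pageviews (user_id VARCHAR, url VARCHAR)
  WITH (KAFKA_TOPIC = 'pageviews', VALUE_FORMAT = 'JSON');

-- A continuous query over the stream: count views per user
SELECT user_id, COUNT(*)
  FROM pageviews
  GROUP BY user_id;
```

Unlike a query against a traditional database, a KSQL query over a stream runs continuously, emitting updated results as new records arrive on the topic.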