Apache Kafka
Apache Kafka is a community-developed, distributed event-streaming platform that can handle high volumes of data and lets you pass messages from one endpoint to another. It integrates well with Apache Storm and Apache Spark for real-time streaming analysis. Kafka is a strong choice when you need reliability, speed, scalability, durability, and high performance with minimal downtime and data loss.
Kafka is used for operational monitoring data and to collect logs from multiple services across an organisation, making them available to multiple consumers. Thousands of companies build applications on Kafka, including Twitter, LinkedIn, Netflix, Airbnb, Mozilla, Oracle, and Microsoft.
In this Apache Kafka course by Uplatz, you will learn the basics of Apache Kafka: what it is, its various components and use cases, implementing Kafka on a single node, multi-node cluster setup, the various administration commands, leadership balancing and partition rebalancing, and much more.
---------------------------------------------------------------------------------------------------------------
Apache Kafka
What is Kafka – An Introduction
Understanding what Apache Kafka is, the various components and use cases of Kafka, implementing Kafka on a single node.
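The single-node implementation covered here can be sketched with the standard scripts that ship in the Kafka distribution (paths assume you are inside the Kafka installation directory; the topic name is illustrative, and older releases used `--zookeeper localhost:2181` in place of `--bootstrap-server`):

```shell
# Start ZooKeeper first (releases that still depend on it)
bin/zookeeper-server-start.sh config/zookeeper.properties

# In a second terminal, start a single Kafka broker
bin/kafka-server-start.sh config/server.properties

# Create a topic with one partition and no replication
bin/kafka-topics.sh --create --topic demo-topic \
  --bootstrap-server localhost:9092 \
  --partitions 1 --replication-factor 1

# Verify the topic exists
bin/kafka-topics.sh --list --bootstrap-server localhost:9092
```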
Multi Broker Kafka Implementation
Learning the Kafka terminology, deploying a single-node Kafka with an independent ZooKeeper, adding replication in Kafka, working with partitioning and brokers, understanding Kafka consumers, the Kafka writes terminology, and the various failure-handling scenarios in Kafka.
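Running several brokers on one host (the multi-broker setup described above) typically means one `server.properties` file per broker, differing in just a few keys; a minimal sketch with illustrative values:

```properties
# config/server-1.properties (second broker on the same host)
broker.id=1
listeners=PLAINTEXT://:9093
log.dirs=/tmp/kafka-logs-1
zookeeper.connect=localhost:2181
```

Each broker is then started with its own file, and topics created with `--replication-factor 2` will have their replicas spread across the brokers.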
Multi Node Cluster Setup
Introduction to multi-node cluster setup in Kafka, the various administration commands, leadership balancing and partition rebalancing, graceful shutdown of Kafka brokers and tasks, working with the Partition Reassignment Tool, cluster expansion, assigning custom partitions, removing a broker, and increasing the replication factor of partitions.
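The Partition Reassignment Tool mentioned above works from a JSON plan; a hedged sketch of the typical generate/execute/verify flow (the topic name and broker ids are examples, and older releases used `--zookeeper` instead of `--bootstrap-server`):

```shell
# List the topics whose partitions should move
echo '{"topics":[{"topic":"demo-topic"}],"version":1}' > topics-to-move.json

# Ask Kafka to generate a candidate reassignment onto brokers 0, 1 and 2
bin/kafka-reassign-partitions.sh --bootstrap-server localhost:9092 \
  --topics-to-move-json-file topics-to-move.json \
  --broker-list "0,1,2" --generate

# Save the proposed plan as reassignment.json, then execute it
bin/kafka-reassign-partitions.sh --bootstrap-server localhost:9092 \
  --reassignment-json-file reassignment.json --execute

# Re-run with --verify until the reassignment completes
bin/kafka-reassign-partitions.sh --bootstrap-server localhost:9092 \
  --reassignment-json-file reassignment.json --verify
```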
Integrate Flume with Kafka
Understanding the need for Kafka integration, successfully integrating it with Apache Flume, and the steps for integrating Flume with Kafka as a source.
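Using Kafka as a Flume source comes down to a few agent properties; a minimal sketch (the agent, channel, and topic names are illustrative):

```properties
# Flume agent "tier1" reading from Kafka into a memory channel
tier1.sources = kafka-source
tier1.channels = mem-channel
tier1.sinks = log-sink

tier1.sources.kafka-source.type = org.apache.flume.source.kafka.KafkaSource
tier1.sources.kafka-source.kafka.bootstrap.servers = localhost:9092
tier1.sources.kafka-source.kafka.topics = demo-topic
tier1.sources.kafka-source.channels = mem-channel

tier1.channels.mem-channel.type = memory

tier1.sinks.log-sink.type = logger
tier1.sinks.log-sink.channel = mem-channel
```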
Kafka API
Detailed understanding of the Kafka and Flume integration, deploying Kafka as a sink and as a channel, an introduction to the PyKafka API, and setting up the PyKafka environment.
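Deploying Kafka as a Flume sink (the reverse direction from the previous module) is likewise configuration-driven; a minimal sketch with illustrative agent names and paths:

```properties
# Flume agent writing events from a spooling directory into Kafka
agent1.sources = spool-src
agent1.channels = mem-ch
agent1.sinks = kafka-sink

agent1.sources.spool-src.type = spooldir
agent1.sources.spool-src.spoolDir = /var/log/incoming
agent1.sources.spool-src.channels = mem-ch

agent1.channels.mem-ch.type = memory

agent1.sinks.kafka-sink.type = org.apache.flume.sink.kafka.KafkaSink
agent1.sinks.kafka-sink.kafka.bootstrap.servers = localhost:9092
agent1.sinks.kafka-sink.kafka.topic = flume-events
agent1.sinks.kafka-sink.channel = mem-ch
```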
Producers & Consumers
Connecting to Kafka using PyKafka, writing your own Kafka producers and consumers, writing a random JSON producer, writing a consumer to read the messages from a topic, writing and working with a file-reader producer, and writing a consumer to store topic data into a file.
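A random JSON producer and a file-writing consumer of the kind described can be sketched with PyKafka as follows. The broker address, topic name, and message fields are illustrative, and the producer/consumer functions assume a broker running on localhost; the PyKafka import is kept inside those functions so the message builder can be used on its own:

```python
import json
import random


def make_message():
    """Build one random JSON payload as bytes (fields are illustrative)."""
    record = {
        "sensor_id": random.randint(1, 100),
        "temperature": round(random.uniform(15.0, 35.0), 2),
    }
    return json.dumps(record).encode("utf-8")


def run_producer(hosts="127.0.0.1:9092", topic_name=b"sensor-data", count=10):
    """Publish `count` random JSON messages to the topic."""
    from pykafka import KafkaClient  # requires a reachable broker

    client = KafkaClient(hosts=hosts)
    topic = client.topics[topic_name]
    with topic.get_sync_producer() as producer:
        for _ in range(count):
            producer.produce(make_message())


def run_consumer(path, hosts="127.0.0.1:9092", topic_name=b"sensor-data"):
    """Read messages from the topic and append them, one per line, to a file."""
    from pykafka import KafkaClient  # requires a reachable broker

    client = KafkaClient(hosts=hosts)
    topic = client.topics[topic_name]
    # Stop iterating once no message arrives for 5 seconds
    consumer = topic.get_simple_consumer(consumer_timeout_ms=5000)
    with open(path, "a") as out:
        for message in consumer:
            if message is not None:
                out.write(message.value.decode("utf-8") + "\n")
```

With a local broker running, `run_producer()` followed by `run_consumer("sensor-data.txt")` would write the produced JSON records into the file.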
---------------------------------------------------------------------------------------------------------------