Working with Apache Kafka

Get hands-on tooling experience with architecting, programming, streaming, monitoring, and tuning your data using Apache Kafka.

Apache Kafka is the industry-leading tool for real-time data pipeline processing, and a key solution to the challenges of transporting big data reliably at scale. Its high scalability, fault tolerance, execution speed, and fluid integrations are hallmarks that make it an integral part of many enterprise data architectures.

This hands-on Apache Kafka training workshop gets you up and running so you can immediately take advantage of the low latency, massive parallelism, and exciting use cases Kafka makes possible. Led by an enterprise engineering expert, you’ll get live instruction and coaching on how to be effective when using Kafka in your work or project.

This “skills-centric” course is about 50% hands-on lab and 50% lecture, coupling the most current techniques with the soundest industry practices. Throughout the course, you will be led through a series of progressively advanced topics, where each topic consists of lectures, group discussion, comprehensive hands-on lab exercises, and lab review.

What You’ll Learn:

  • Get Kafka up and running
  • Produce and Consume Messages
  • Write Kafka clients in Java
  • Program using Kafka API
  • Build a data streaming pipeline using Kafka Streams
  • Monitor Kafka Performance Metrics
  • Tune Kafka for Optimal Performance
  • Troubleshoot Common Kafka Issues
  • Administer and Deploy Kafka

Course Outline:

Part 1:  Introduction to Streaming Systems

  1. Fast data
  2. Streaming architecture
  3. Lambda architecture
  4. Message queues
  5. Streaming processors

Part 2:  Introduction to Kafka

  1. Architecture
  2. Comparing Kafka with other queue systems (JMS / MQ)
  3. Kafka concepts: Messages, Topics, Partitions, Brokers, Producers, commit logs
  4. Kafka & ZooKeeper
  5. Producing messages
  6. Consuming messages (Consumers, Consumer Groups)
  7. Message retention
  8. Scaling Kafka
  9. Labs: Getting Kafka up and running; Using Kafka utilities
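The concepts above — topics, partitions, offsets, and keyed producing — fit together in a simple mental model: a topic is a set of append-only partition logs, and consumers read from a position (offset) they track themselves. The plain-Java sketch below models that idea with no Kafka dependency; the class and method names are illustrative, not the Kafka API.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Conceptual sketch of Kafka's storage model (not the real API):
// a topic is a set of append-only partition logs, and each consumer
// tracks its own per-partition offset.
public class TopicSketch {
    private final Map<Integer, List<String>> partitions = new HashMap<>();

    public TopicSketch(int partitionCount) {
        for (int p = 0; p < partitionCount; p++) {
            partitions.put(p, new ArrayList<>());
        }
    }

    // A producer appends to a partition chosen by hashing the key, so
    // records with the same key always land in the same partition, in order.
    public void produce(String key, String value) {
        int partition = Math.abs(key.hashCode()) % partitions.size();
        partitions.get(partition).add(value);
    }

    // A consumer reads forward from a given offset; the log itself is
    // immutable, so many consumer groups can read independently.
    public List<String> consumeFrom(int partition, int offset) {
        List<String> log = partitions.get(partition);
        return log.subList(offset, log.size());
    }

    public static void main(String[] args) {
        TopicSketch topic = new TopicSketch(3);
        topic.produce("user-1", "login");
        topic.produce("user-1", "click");
        int p = Math.abs("user-1".hashCode()) % 3;
        // Both records for user-1 are in the same partition, in order.
        System.out.println(topic.consumeFrom(p, 0)); // [login, click]
    }
}
```

Note that "consuming" here does not delete anything — message retention in Kafka is a broker-side policy, which is why multiple consumer groups can replay the same log.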

Part 3:  Programming with Kafka

  1. Configuration parameters
  2. Producer API (Sending messages to Kafka)
  3. Consumer API (Consuming messages from Kafka)
  4. Commits, Offsets, Seeking
  5. Schemas with Avro
  6. Lab: Writing Kafka clients in Java; Benchmarking Producer APIs
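Configuration parameters are the first thing a Kafka client sees: both the Producer and Consumer APIs are configured through a `java.util.Properties` object before the client is constructed. The sketch below builds a typical producer configuration using standard Kafka producer config keys; the broker address is a placeholder.

```java
import java.util.Properties;

// Typical producer configuration. The config keys are standard Kafka
// producer settings; the broker address is a placeholder.
public class ProducerConfigSketch {
    public static Properties producerProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // broker list (placeholder)
        props.put("key.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");
        props.put("acks", "all");    // wait for all in-sync replicas to acknowledge
        props.put("retries", "3");   // retry transient send failures
        props.put("linger.ms", "5"); // small batching delay to improve throughput
        return props;
    }

    public static void main(String[] args) {
        // In a project with the kafka-clients dependency on the classpath,
        // these properties would be passed to new KafkaProducer<>(props).
        System.out.println(producerProps().getProperty("acks"));
    }
}
```

Settings like `acks` and `linger.ms` are exactly the knobs the benchmarking lab exercises: trading delivery guarantees and latency against throughput.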

Part 4:  Kafka Streams

  1. Streams overview and architecture
  2. Streams use cases and comparison with other platforms
  3. Learning Kafka Streams concepts (KStream, KTable, state stores)
  4. KStream operations (transformations, filters, joins, aggregations)
  5. Labs: Kafka Streaming labs

Part 5:  Administering Kafka

  1. Hardware / Software requirements
  2. Deploying Kafka
  3. Configuration of brokers / topics / partitions / producers / consumers
  4. Security: how to secure a Kafka cluster and client communications (SASL, Kerberos)
  5. Monitoring: tools for monitoring a cluster
  6. Capacity Planning: estimating usage and demand
  7. Troubleshooting: failure scenarios and recovery
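Broker, topic, and partition configuration is driven by the broker's `server.properties` file. The fragment below shows a representative subset of standard broker settings; the paths and values are placeholders chosen for illustration.

```properties
# Representative broker settings (values are illustrative placeholders)
broker.id=0                        # unique id per broker in the cluster
listeners=PLAINTEXT://:9092        # where the broker accepts client connections
log.dirs=/var/lib/kafka-logs       # where partition logs are stored on disk
num.partitions=3                   # default partition count for new topics
default.replication.factor=3      # replicas per partition, for fault tolerance
log.retention.hours=168            # retain messages for 7 days
zookeeper.connect=localhost:2181   # ZooKeeper ensemble the broker registers with
```

Settings like `num.partitions`, `default.replication.factor`, and `log.retention.hours` are the levers the capacity-planning and troubleshooting sections build on.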

Part 6:  Monitoring and Instrumenting Kafka

  1. Monitoring Kafka
  2. Instrumenting with Metrics library
  3. Labs: Monitor a Kafka cluster; instrument Kafka applications and monitor their performance
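Kafka brokers and clients expose their metrics over JMX — for example, broker MBeans in the `kafka.server` domain such as `kafka.server:type=BrokerTopicMetrics,name=MessagesInPerSec`. The stdlib-only sketch below reads a built-in JVM MBean to show the `javax.management` pattern a monitoring agent would use; against a broker, only the `ObjectName` would change.

```java
import java.lang.management.ManagementFactory;
import javax.management.MBeanServer;
import javax.management.ObjectName;
import javax.management.openmbean.CompositeData;

// Reading a metric over JMX. This queries a built-in JVM MBean as a
// stand-in; for Kafka, the ObjectName would name a kafka.server MBean
// such as kafka.server:type=BrokerTopicMetrics,name=MessagesInPerSec.
public class JmxSketch {
    public static long heapUsedBytes() throws Exception {
        MBeanServer server = ManagementFactory.getPlatformMBeanServer();
        ObjectName memory = new ObjectName("java.lang:type=Memory");
        CompositeData usage =
            (CompositeData) server.getAttribute(memory, "HeapMemoryUsage");
        return (Long) usage.get("used");
    }

    public static void main(String[] args) throws Exception {
        System.out.println("Heap used: " + heapUsedBytes() + " bytes");
    }
}
```

Tools such as Grafana dashboards (used in the Part 7 workshop) typically sit on top of an agent that scrapes exactly these JMX attributes.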

Part 7:  Case Study / Workshop (Time-Permitting)

  • Students will build an end-to-end application that simulates web traffic and sends metrics to Grafana.

This course is also available on our public schedule via Live Virtual Classroom:

Contact us here.