
Training: Apache Kafka – Fundamentals

Ref. KAF-FO
Duration: 1 day
Exam: Not certifying
Level: Fundamental

Apache Kafka Training - Fundamentals

The Apache Kafka – Fundamentals training helps you understand the key role of this system in real-time data management. Designed as a reliable and scalable streaming platform, Apache Kafka provides organizations with a way to process massive data flows quickly and accurately. Many sectors, such as finance, e-commerce, or cybersecurity, already rely on Kafka to analyze and act on their data instantly.

Why take this Apache Kafka training

In a context where data has become strategic, mastering Apache Kafka is a major asset. This Kafka course introduces the essential foundations to understand how it works, its architecture, and its real-world applications. You will learn to identify scenarios where Kafka brings real added value, such as real-time fraud detection, application integration, or microservices management. This pragmatic approach allows you to grasp the importance of a modern event streaming system in a constantly evolving digital environment.

Participant Profiles

  • Developers and software engineers
  • System and platform administrators
  • Technical architects
  • Data analysts and data engineers
  • IT managers involved in real-time projects

Objectives

  • Understand Kafka’s architecture and key concepts
  • Explore major enterprise use cases
  • Configure producers, consumers, and brokers
  • Ensure high availability and replication
  • Integrate Kafka into an existing environment
  • Leverage Kafka Connect, Schema Registry, and KSQL
  • Discover the Confluent platform and its tools

Prerequisites

  • Basic understanding of the Linux OS
  • Experience using a shell such as Bash is beneficial

Course Content

Module 1: Motivation and Customer Use Cases

  • Motivation for a paradigm change to “Event-driven”
  • How Kafka is the backbone of real-time event streaming
  • How other major players in the market use Kafka
  • Customer Use Cases
  • Microservices, IoT and Edge Computing
  • Core Banking, payments engine and fraud detection
  • Cyber Data Collection and Dissemination
  • ESB Replacement
  • Data Pipelining
  • eCommerce and Customer 360
  • Mainframe offloading

Module 2: Apache Kafka Fundamentals

  • Architecture
  • ZooKeeper’s role
  • Topics, Partitions and Segments
  • The commit log and streams
  • Brokers and Broker replication
  • Producers Basics
  • Consumers, Consumer groups and Offsets

Module 3: How Kafka Works

  • High-level code overview for a basic producer and a basic consumer (see the sketch after this list)
  • High Availability through Replication
  • Data Retention Policies
  • Producer Design and Producer Guarantees
  • Delivery Guarantees, including Exactly-Once Semantics
  • Partition strategies
  • Consumer group rebalances
  • Compacted Topics
  • Troubleshooting strategies
  • Security overview
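
To give a feel for the code overview referred to in the module outline above, here is a minimal sketch of a basic producer and consumer using the standard Kafka Java client. The broker address, topic name, and consumer group are placeholders chosen for illustration, not part of the course material.

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class BasicProducerConsumerSketch {

        // Publish one message to a hypothetical "payments" topic;
        // the record key determines which partition it lands on.
        static void produce() {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092"); // placeholder broker address
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());
            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("payments", "customer-42", "payment accepted"));
            }
        }

        // Read messages from the same topic as part of a consumer group;
        // offsets are tracked per group so several instances can share the partitions.
        static void consume() {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092"); // placeholder broker address
            props.put("group.id", "payments-readers");        // illustrative consumer group
            props.put("auto.offset.reset", "earliest");       // start from the beginning for this demo
            props.put("key.deserializer", StringDeserializer.class.getName());
            props.put("value.deserializer", StringDeserializer.class.getName());
            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("payments"));
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("partition=%d offset=%d value=%s%n",
                            record.partition(), record.offset(), record.value());
                }
            }
        }

        public static void main(String[] args) {
            produce();
            consume();
        }
    }

The lab exercises later in this course revisit the same pattern, including observing consumer lag as the consumer group is scaled.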

Module 4: Integrating Kafka into your Environment

  • Get streams of data into and out of Kafka with Kafka Connect and REST Proxy
  • Maintain data formats and ensure compatibility with Schema Registry and Avro
  • Build real-time streaming applications with Confluent KSQL & Kafka Streams

Module 5: The Confluent Platform

  • The Streaming Platform as the Central Nervous System
  • Deployment Models: on-premises versus SaaS
  • The Confluent Control Center
  • Role Based Access Control (RBAC)
  • The Confluent CLI
  • Confluent Operator
  • The Confluent Hub for Certified Connectors

Documentation

Course material included.

Lab / Exercises

  • Launching and exploring a minimal Kafka cluster
  • Using Kafka command line tools to explore cluster metadata in ZooKeeper, create topics on the cluster, and publish & consume messages
  • Running a Java-based consumer and observing consumer lag when scaling the consumer
  • Configuring Kafka Connect with an MQTT connector source to create a data pipeline
  • Using Confluent Control Center to monitor your cluster and execute KSQL queries

Exam

  • This course prepares you for the Confluent Certified Developer for Apache Kafka exam. If you wish to take this exam, please contact our secretariat, who will inform you of the exam fee and handle all the necessary administrative procedures for you

Complementary Courses

Eligible Funding

ITTA is a partner of a continuing education fund dedicated to temporary workers. This fund can subsidize your training, provided that you are subject to the “Service Provision” collective labor agreement (CCT) and meet certain conditions, including having worked at least 88 hours in the past 12 months.

Additional Information

The strategic role of Apache Kafka in modern systems

Apache Kafka has become an essential component of modern data infrastructures. In a world where information flows continuously, companies must be able to collect, process, and react in real time. Kafka positions itself as a reliable and scalable solution to meet this challenge. Its ability to handle massive volumes of events while maintaining consistent performance makes it a valuable asset for organizations of all sizes. The Apache Kafka – Fundamentals training helps you understand these mechanisms and apply them in a professional context.

Why companies adopt Apache Kafka

Many sectors integrate Kafka into their architectures for diverse needs. In finance, it is used to analyze transactions and detect anomalies within seconds. Transport companies rely on it to monitor fleets of vehicles in real time. E-commerce platforms use Kafka to analyze customer journeys and personalize experiences instantly. Finally, industrial organizations use Kafka to process IoT sensor data and optimize production. These examples demonstrate that Kafka is not limited to a specific domain but can be adapted to any real-time project.

The tangible benefits of a distributed architecture

Unlike traditional approaches based on centralized databases, Kafka relies on a distributed architecture. Each topic is divided into partitions, allowing multiple data streams to be processed in parallel, and each partition is replicated across brokers. This design ensures high availability and fault tolerance: if one node fails, another takes over without service interruption. For companies, this means operational continuity and a significant reduction in the risk of data loss. How this works is explained in detail during the course so that every participant can understand its technical and organizational implications.
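
As a simple illustration of these ideas, the sketch below uses the Kafka Java admin client to create a topic whose partition count enables parallel processing and whose replication factor keeps copies on several brokers. The broker address, topic name, and numbers are illustrative assumptions only.

    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.NewTopic;

    public class CreateTopicSketch {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092"); // placeholder broker address

            try (AdminClient admin = AdminClient.create(props)) {
                // 6 partitions let several consumers work in parallel;
                // replication factor 3 keeps each partition on three brokers,
                // so the topic stays available if one broker fails.
                NewTopic orders = new NewTopic("orders", 6, (short) 3);
                admin.createTopics(List.of(orders)).all().get();
            }
        }
    }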

Performance and reliability guarantees

One aspect that sets Kafka apart from other systems is its ability to guarantee message delivery. Through configuration options such as retention policies and Exactly Once mechanisms, Kafka ensures consistency in event processing. This reliability is crucial in sensitive contexts such as financial transactions or healthcare systems. The course emphasizes these aspects so participants can deploy robust solutions tailored to the most demanding business requirements.
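
By way of illustration (the settings and values below are a sketch under assumed names, not the course's reference configuration), an idempotent producer that waits for acknowledgement from all in-sync replicas avoids duplicates caused by retries; full exactly-once pipelines additionally rely on Kafka's transactional API.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class ReliableProducerSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.ACKS_CONFIG, "all");                // wait for all in-sync replicas
            props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true"); // drop duplicates caused by retries

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Illustrative record on a hypothetical "transactions" topic
                producer.send(new ProducerRecord<>("transactions", "tx-1001", "debit 120 CHF"));
            }
        }
    }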

Seamless integration with existing ecosystems

A major advantage of Kafka is its ability to integrate easily with heterogeneous environments. With Kafka Connect and the available connectors, it is possible to link the platform to databases, messaging systems, or business applications. Schema Registry ensures compatibility and the smooth evolution of data schemas over time. This integration optimizes the flow of information between different technology components. The training highlights concrete implementation cases to demonstrate this flexibility.
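
A hedged sketch of what this looks like in code: the producer below serializes records with Avro through Confluent's KafkaAvroSerializer, which registers the schema with Schema Registry and lets the registry enforce compatibility rules when the schema later evolves. The addresses, topic name, and schema are assumptions for illustration, and the serializer class requires the separate kafka-avro-serializer dependency.

    import java.util.Properties;
    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericData;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class AvroProducerSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");          // placeholder broker address
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            // Confluent Avro serializer: registers the schema and enforces compatibility
            props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
            props.put("schema.registry.url", "http://localhost:8081"); // placeholder registry address

            // Illustrative Avro schema for an "Order" record
            Schema schema = new Schema.Parser().parse(
                    "{\"type\":\"record\",\"name\":\"Order\",\"fields\":["
                  + "{\"name\":\"id\",\"type\":\"string\"},"
                  + "{\"name\":\"amount\",\"type\":\"double\"}]}");

            GenericRecord order = new GenericData.Record(schema);
            order.put("id", "order-1");
            order.put("amount", 99.90);

            try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("orders-avro", "order-1", order));
            }
        }
    }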

Building real-time applications with Kafka

Kafka is not limited to event transmission. With Kafka Streams and KSQL, it becomes possible to create applications capable of processing and transforming data in real time. These tools make it possible, for example, to compute metrics, filter information, or trigger alerts instantly. This approach enables innovative use cases in cybersecurity, marketing, logistics, and many other sectors. Participants in the course discover how to implement these features and gain immediate operational value.
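
As a small, hypothetical Kafka Streams example (the topic names, application id, and alert threshold are invented for illustration), the topology below reads payment amounts from one topic, keeps only the large ones, and writes them to an alerts topic. The same logic can also be expressed declaratively in KSQL, which the course covers alongside Kafka Streams.

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;

    public class AlertStreamSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "payment-alerts");    // illustrative app id
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            // Read payment amounts, keep only the large ones, and publish them as alerts
            KStream<String, String> payments = builder.stream("payments");
            payments.filter((key, value) -> Double.parseDouble(value) > 10_000)
                    .to("payment-alerts");

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }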

The contribution of Confluent Platform

The Confluent platform, built around Apache Kafka, adds an extra layer of management and monitoring. It includes advanced tools to simplify administration, strengthen security, and accelerate deployment. The Confluent Control Center, RBAC model, and certified connector hub allow organizations to go further in leveraging Kafka. For companies, this represents time savings and better control over streaming projects. A dedicated section of the course explores these tools to provide a complete view of the ecosystem.

A key skill for IT professionals

Mastering Kafka is now a considerable advantage for IT professionals. Whether you are a developer, architect, or system administrator, this expertise opens opportunities in strategic projects. Organizations increasingly seek profiles capable of designing and maintaining event-driven architectures. The training provides a progressive approach, allowing participants to become familiar with the concepts while building a solid foundation to advance toward more specialized roles.

FAQ

What are the main benefits of Apache Kafka?
Kafka offers reliable event management, high scalability, and seamless integration with many systems.

Why take an Apache Kafka training course?
A guided course helps you understand the fundamentals, avoid common mistakes, and save time on real-world projects.

What types of projects use Apache Kafka?
Kafka is used in fraud detection, real-time analytics, IoT data integration, and e-commerce projects.

Do I need advanced technical skills to get started?
No, basic knowledge of Linux and distributed systems is sufficient to understand and progress effectively in the course.

What is the difference between Apache Kafka and Confluent Platform?
Apache Kafka is the open-source core, while Confluent enhances the ecosystem with management, security, and integration tools.

Registration fee
CHF 750.-
Included in this course
  • Training provided by an industry expert
  • Digital documentation and materials
  • Achievement badge
Upcoming sessions

  • Fri 26 Sep, 09:00–17:00, Virtual (KAF-FO)
  • Fri 26 Sep, 09:00–17:00, Lausanne, Avenue Mon repos 24, 1005 Lausanne (KAF-FO)
  • Fri 31 Oct, 09:00–17:00, Virtual (KAF-FO)
  • Fri 31 Oct, 09:00–17:00, Geneva, Route des Jeunes 35, 1227 Carouge (KAF-FO)
  • Fri 05 Dec, 09:00–17:00, Virtual (KAF-FO)
  • Fri 05 Dec, 09:00–17:00, Lausanne, Avenue Mon repos 24, 1005 Lausanne (KAF-FO)

Contact

ITTA
Route des jeunes 35
1227 Carouge, Switzerland

Opening hours

Monday to Friday
8:30 AM to 6:00 PM
Tel. 058 307 73 00
