Search results
Apache Kafka is a data streaming system used for real-time data pipelines, data integration, and event-driven systems. Learn how Kafka works with examples and use cases.
A complete collection of demos and examples for Apache Kafka, Kafka Streams, Confluent, and other real-time data streaming technologies.
11 May 2024 · 1. Overview. In this tutorial, we’ll learn the basics of Kafka – the use cases and core concepts anyone should know. We can then find and understand more detailed articles about Kafka. 2. What Is Kafka? Kafka is an open-source stream processing platform developed by the Apache Software Foundation.
This is a curated list of demos that showcase Apache Kafka® event stream processing on the Confluent Platform, an event stream processing platform that enables you to process, organize, and manage massive amounts of streaming data across cloud, on-prem, and serverless deployments. Where to start.
13 Mar 2024 · High throughput. Kafka’s well-designed architecture, which includes data partitioning, batch processing, zero-copy techniques, and append-only logs, enables it to reach high throughput and handle millions of messages per second, catering to high-velocity and high-volume data scenarios.
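As a rough illustration of the batching behavior mentioned above, here is a minimal Java producer sketch. The broker address localhost:9092, the topic name "events", and the specific batch.size / linger.ms / compression values are assumptions chosen for the example, not settings taken from the snippet.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class BatchingProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed local broker
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        // Batching and compression settings: trade a little latency for higher throughput
        props.put("batch.size", 65536);       // accumulate up to 64 KB per partition batch (example value)
        props.put("linger.ms", 10);           // wait up to 10 ms for a batch to fill before sending
        props.put("compression.type", "lz4"); // compress whole batches on the wire

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            for (int i = 0; i < 1_000; i++) {
                // Records with the same key land in the same partition, preserving per-key order
                producer.send(new ProducerRecord<>("events", "key-" + (i % 8), "value-" + i));
            }
        } // close() flushes any records still sitting in open batches
    }
}
```

Larger batches and a small linger window let the broker receive fewer, bigger requests, which is one of the ways the append-only log design reaches the throughput figures described above.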
This Apache Kafka tutorial provided a comprehensive overview of Kafka and its key features. We have explored the various components of the Kafka cluster, including brokers, producers, and consumers, and delved into the core concepts such as topics, partitions, consumer groups, commit logs, and retention policy.
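To make the consumer-group concept from that summary concrete, the following is a small Java consumer sketch. The group id "example-group", the topic "events", and the broker address are assumptions for illustration; consumers sharing the same group id split the topic's partitions among themselves.

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class GroupConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // assumed local broker
        props.put("group.id", "example-group");              // consumers with this id share the partitions
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());
        props.put("auto.offset.reset", "earliest");           // read from the start of the commit log if no offset is stored

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("events")); // topic name is an assumption
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // Partition and offset identify each record's position in the commit log
                    System.out.printf("partition=%d offset=%d key=%s value=%s%n",
                            record.partition(), record.offset(), record.key(), record.value());
                }
                // Offsets are committed automatically here (enable.auto.commit defaults to true)
            }
        }
    }
}
```

Running several copies of this process with the same group id spreads the topic's partitions across them, which is how consumer groups scale read throughput while the retention policy keeps the underlying log available for replay.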
3 Feb 2023 · Payment processing in banking. Online fraud detection. Managing inventory and supply chains. Tracking order shipments. Collecting telemetry data from Internet of Things (IoT) devices. What all these uses have in common is that they need to take in and process data in real time, often at huge scales. This is something Kafka excels at.