Kafka Client Example

In the previous section, you defined a set of user credentials that are authenticated by the Kafka broker. In the AWS SDK for Ruby, a client is constructed with kafka = Aws::Kafka::Client.new. The Kafka APIs for most programming languages are pretty straightforward and simple, but there is a lot of complexity under the hood. Package sarama is a pure Go client library for dealing with Apache Kafka (versions 0.8 and later). Click the Apache Kafka Client JAR link to download the JAR file. In addition, the broker properties are loaded from a broker properties file. Learn how to connect to Kafka from development clients using a VPN gateway, or from clients in your on-premises network by using a VPN gateway device. Does any C/C++ Kafka client have this kind of consumer group support built in? A working producer example can be found here: ProducerExample. If the linked compatibility wiki is not up to date, please contact Kafka support or the community to confirm compatibility. Older or newer brokers may not support certain features. I am new to Kafka; can you please provide an example of reading messages one by one, and only committing once you have processed the message? Kafka Monitor allows you to monitor a cluster using end-to-end pipelines to obtain vital statistics such as end-to-end latency, service availability, and message loss rate. Confluent Platform includes the Java consumer shipped with Apache Kafka®. The library follows the librdkafka API closely where possible. The version of the client it uses may change between Flink releases. If any doubt occurs regarding Kafka clients, feel free to ask through the comment section. In this scenario, the light sensor needs to talk to the LED, which is an example of M2M communication.
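The read-one-message-then-commit pattern asked about above can be sketched in a few lines. This is a minimal Python sketch assuming a kafka-python-style consumer object (poll()/commit()/close()); the function name is hypothetical and the loop is simplified, not a production implementation:

```python
def consume_one_by_one(consumer, handler):
    """Poll records one at a time and commit only after each is processed.

    `consumer` is any object exposing kafka-python-style poll(), commit(),
    and close(); `handler` does the actual per-message work. Committing
    after the handler returns means a crash mid-processing re-delivers
    the message instead of losing it (at-least-once semantics).
    """
    processed = 0
    try:
        while True:
            batch = consumer.poll(timeout_ms=1000, max_records=1)
            if not batch:
                break  # simplified: a real consumer would keep polling
            for records in batch.values():
                for record in records:
                    handler(record)    # process first...
                    consumer.commit()  # ...commit only on success
                    processed += 1
    finally:
        consumer.close()
    return processed
```

With kafka-python itself, this corresponds to constructing KafkaConsumer with enable_auto_commit=False so the explicit commit() is the only commit.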
For example, the following properties allow the stage to use SSL/TLS to connect to Kafka with client authentication. Enabling Kerberos (SASL): when you use Kerberos authentication, Data Collector uses the Kerberos principal and keytab to connect to Kafka. This tutorial uses the kafka-console-producer and kafka-console-consumer scripts to generate and display Kafka messages. It is easy to set up a local instance for testing. Lenses for Apache Kafka allows, among other things, browsing data on Kafka topics. Let's get started. The examples are extracted from open source Python projects. For example, to start Kafka Monitor and begin monitoring a cluster, use the following script, adding the parameters specific to your cluster. One of the interesting things about the Kafka project is that the implementations of Kafka clients (other than the default JVM client) are not maintained by the project. In order to demonstrate this behavior, we will start from a previous Spring Kafka tutorial in which we send/receive messages to/from an Apache Kafka topic using Spring Kafka. Hi, I use such metrics as: the position in Google search; the number of releases and the current release number; the number of commits, pull requests, and fixed issues. What Kafka needs is an improvement to its low-level API and a good client that provides a middle-level API with good quality. Event Hubs supports Apache Kafka 1.0 and newer client versions, and works with existing Kafka applications, including MirrorMaker: all you have to do is change the connection string and start streaming events from your applications that use the Kafka protocol into Event Hubs. Kafka Tutorial 13: Creating Advanced Kafka Producers in Java (slides). Advanced use cases for librdkafka are documented in the librdkafka wiki. Provides Kafka FETCH and OFFSETS requests.
Prerequisites include Apache Maven 3 and Docker Compose to start an Apache Kafka development cluster. Apache Kafka is an open-source stream-processing software platform developed by LinkedIn and donated to the Apache Software Foundation, written in Scala and Java. For example, the sample app doesn't have a message-specific key, so we'll use null for the key. Kafka clients (producer, consumer, and so on) are set up to authenticate and authorize themselves with a Kafka broker by following two steps. kafka-python is designed to function much like the official Java client, with a sprinkling of pythonic interfaces (e.g., consumer iterators). Real-time analysis of data coming from IoT devices or user actions on a website are a couple of basic examples that Kafka Streams can easily handle. We will have a separate consumer and producer defined in Java; the producer will send messages to the topic and the consumer will read them back. Features: high performance - confluent-kafka-dotnet is a lightweight wrapper around librdkafka, a finely tuned C client. Package kafka provides high-level Apache Kafka producer and consumers using bindings on top of the librdkafka C library. This Kafka tutorial session will explain how to correctly configure the Kafka consumer client, and optimizations that make the consumer production ready. If you're using Maven, you can replace the Kafka client dependency with the corresponding Pulsar client dependency. The Spring Cloud Stream Kafka binder is available as the spring-cloud-stream-binder-kafka artifact. The Kafka Producer API allows applications to send streams of data to the Kafka cluster. Apache Kafka license: Apache 2.0. Here is a simple example of using the producer to send records with strings containing sequential numbers as the key/value pairs.
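To make that last idea concrete, here is a minimal Python sketch that builds such sequential key/value pairs and hands them to a kafka-python-style producer. The helper names are hypothetical; any object exposing send() and flush() will do:

```python
def sequential_records(n):
    """Build n key/value pairs whose keys and values are the stringified
    sequence numbers 0..n-1, encoded as bytes (Kafka payloads are byte arrays)."""
    return [(str(i).encode("utf-8"), str(i).encode("utf-8")) for i in range(n)]

def send_all(producer, topic, records):
    """Send every record to the topic, then flush so buffered sends complete."""
    for key, value in records:
        producer.send(topic, key=key, value=value)
    producer.flush()
```

With kafka-python this would be used with a real KafkaProducer(bootstrap_servers=...) instance in place of the duck-typed producer.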
.Net implementation of the Apache Kafka protocol that provides basic functionality through Producer/Consumer classes. You can also configure the Kafka Producer to determine the topic to write to at runtime. It supports all content types (primitives | Json | Avro) and you can browse data by offset. This universal Kafka connector attempts to track the latest version of the Kafka client. Over the last few months Apache Kafka has gained a lot of traction in the industry, and more and more companies are exploring how to use Kafka effectively in their production environments. We're fans of his work and were very happy to have been able to leverage rdkafka-dotnet as the basis of this client. client_id (str) - a name for this client (default: 'kafka-python-{version}'). The Confluent Python client confluent-kafka-python leverages the high-performance C client librdkafka (also developed and supported by Confluent). Hence, in this Kafka Serialization and Deserialization tutorial, we have learned to create a custom Kafka SerDe example. Welcome, folks! Read about microservices / event-driven architecture first. You can set the topic dynamically by using a format string to access any event field. Kafka is a distributed publish-subscribe messaging system. These scripts read from STDIN and write to STDOUT and are frequently used to send and receive data via Kafka over the command line. Write an Apache Kafka custom partitioner. After googling a bit I found this project: Kafka Connect Azure IoT Hub. There are two components of the processor client: a "lower-level" processor that provides APIs for data processing, composable processing, and local state storage. This week we follow up by showcasing how you can use that Kafka producer in conjunction with NodeJS and SocketIO to push real-time updates.
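Setting the topic dynamically with a format string can be sketched in one line of Python: str.format pulls the field out of the event. The resolve_topic helper is hypothetical, not part of any Kafka client API:

```python
def resolve_topic(template, event):
    """Resolve the destination topic from an event dict using a format string,
    e.g. template '{service}-logs' with event {'service': 'auth'} -> 'auth-logs'."""
    return template.format(**event)
```

Unused event fields are simply ignored by format, so the same event can feed several templates.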
This consumer consumes messages from the Kafka Producer you wrote in the last tutorial. The client must be configured with at least one broker. This would let someone who is not a Java developer come to the Kafka site, click on the link for their language, and view examples of interacting with Kafka. It will transparently handle the failure of servers in the Kafka cluster, and transparently adapt as partitions of the data it fetches migrate within the cluster. In this example, we'll be feeding weather data into Kafka and then processing this data from Spark Streaming in Scala. In this post we develop a Spring Boot Admin Server and Client module. How to detect that an offset doesn't exist? kafka-python is best used with newer brokers (0.9+), but is backwards-compatible with older versions (to 0.8.0). confluent-kafka-dotnet is Confluent's .NET client for Apache Kafka. Moreover, we saw the need for a serializer and deserializer with Kafka. On the Kafka Producer side of things, check out the kafka-console-producer examples. The Kafka Avro client is a Python package extending the basic capabilities of Confluent's Kafka client. This article explains how to write Kafka messages to a Kafka topic (producer) and read messages from the topic (consumer) using a Scala example; the producer sends messages to Kafka topics in the form of records, a record is a key-value pair along with the topic name, and the consumer receives messages from a topic. In earlier versions of Kafka, partition balancing was left to the client. A sample configuration file for the Kafka producer is as follows:
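A sketch of such a producer configuration, written as a Python dict using librdkafka/confluent-kafka property names; the broker address and the specific values are illustrative assumptions, not taken from the original article:

```python
producer_config = {
    "bootstrap.servers": "localhost:9092",  # assumed local development broker
    "client.id": "example-producer",        # shows up in broker-side logs
    "acks": "all",                 # strongest write guarantee: all in-sync replicas
    "compression.type": "snappy",  # compress record batches on the wire
    "retries": 5,                  # retry transient send failures
}
```

With confluent-kafka-python, a dict like this is passed directly to Producer(producer_config); in a Java properties file the same keys appear one per line.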
Functionally, of course, Event Hubs and Kafka are two different things. The best way to test 2-way SSL is using the Kafka console clients; we don't have to write a line of code to test it. For the Node.js client, Kafka has a producer. The aim of the Processor API is to introduce a client to enable processing data consumed from Kafka and writing the results back into Kafka. Producer and Consumer implementations are provided. Starting with the Flink 1.9 release, the universal connector uses the Kafka 2.2 client. KAFKA_LISTENERS is a comma-separated list of listeners, giving the host/IP and port to which Kafka binds and on which it listens. Hello, does anyone know what the behaviour of the Kafka connector is, please? The documentation is really thin; for example, some options apparently cannot be set. Yes, the name of the project can be misleading, but since IoT Hub is a service which relies on Event Hubs, and a close look at the code showed that it uses the Event Hubs client for Java, I decided to give it a try. What is a Kafka consumer? A consumer is an application that reads data from Kafka topics. Hello guys, I was studying on the Internet how to set up a server containing Kafka and Apache Spark, but I didn't find any simple example of it. kafka-python aims to replicate the Java client API exactly. A step-by-step guide to implementing a Kafka consumer is provided for understanding. Apache Kafka - Simple Producer Example: let us create an application for publishing and consuming messages using a Java client. Along with this, we also learned about the Avro Kafka producer and consumer clients.
In this post you will see how you can write a standalone program that can produce messages and publish them to a Kafka broker. Here is an example of 2-way SSL with Kerberos. It provides an intuitive UI that allows one to quickly view objects within a Kafka cluster as well as the messages stored in the topics of the cluster. This package is available via NuGet. Note that a Kafka topic partition is not the same as a Snowflake micro-partition. This documentation refers to Kafka::Consumer version 1.x. Provides an object-oriented API for consuming messages. Over the course of operating and scaling these clusters to support increasingly diverse and demanding workloads, we've learned a great deal. Ensure that the ports that are used by the Kafka server are not blocked by a firewall. Add the Confluent.Kafka NuGet package. Spark Streaming + Kafka Integration Guide (Kafka broker version 0.8.2.1 or higher). Demonstration of how to send Java objects as JSON and map any incoming JSON string into a Java object using the Java Kafka client. Apache Kafka has made strides in this area; while it only ships a Java client, there is a growing catalog of community open source clients and ecosystem projects, as well as an adapter SDK allowing you to build your own system integration. You can describe a consumer group over a TLS listener with a command such as kafka-consumer-groups.sh --bootstrap-server broker.example.com:9093 --describe --group my-group --command-config client.properties (host and group names illustrative). We create a Message Producer which is able to send messages to a Kafka topic.
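Sending objects as JSON boils down to a serializer/deserializer pair for message values. A minimal Python sketch (function names are hypothetical; kafka-python accepts callables like these as value_serializer and value_deserializer):

```python
import json

def json_serializer(obj):
    """Encode a Python object to UTF-8 JSON bytes, the form a Kafka value needs."""
    return json.dumps(obj).encode("utf-8")

def json_deserializer(data):
    """Decode UTF-8 JSON bytes from a Kafka record back into a Python object."""
    return json.loads(data.decode("utf-8"))
```

Serialize on the producing side, deserialize on the consuming side; the broker itself treats the value as opaque bytes either way.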
This article describes a Spark Streaming example on consuming messages from Kafka and producing messages to Kafka in JSON format, using the from_json and to_json Spark functions respectively. The example consists of an ingesting service, which retrieves tweets from Twitter, and an aggregation service which uses Kafka Streams to aggregate word counts in tumbling time windows. These examples are extracted from open source projects. Our Kafka Azure Client Tool can be used in the same fashion as the Kafka client to achieve high availability in the Azure cloud. Apache Spark Streaming is a scalable, high-throughput, fault-tolerant streaming processing system that supports both batch and streaming workloads. Kafka Connect is a framework for connecting Kafka with external systems such as databases, key-value stores, search indexes, and file systems, using so-called Connectors. Kafka is fast, scalable, and durable. Please note there are multiple ways to translate records to tuples. Normally, you'd be forced to write your stream processing in a JVM language, as that is where the only official Kafka Streams API client is. This Pulsar client library is available at Maven Central and can be installed using Maven, Gradle, and other build tools. Now let's create a route which can post a message to the topic. If any consumer or broker fails to send a heartbeat to ZooKeeper, then it can be re-configured via the Kafka cluster. This client_id string is passed in each request to servers and can be used to identify specific server-side log entries that correspond to this client.
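For the SSL/TLS client-authentication setup mentioned earlier, the client needs a CA certificate plus its own certificate and key. A hedged sketch using librdkafka-style property names; the host and every file path are assumptions for illustration:

```python
ssl_config = {
    "bootstrap.servers": "broker.example.com:9093",  # TLS listener (illustrative host)
    "security.protocol": "SSL",
    "ssl.ca.location": "/etc/kafka/ca-cert.pem",             # CA that signed the broker cert
    "ssl.certificate.location": "/etc/kafka/client-cert.pem",  # client cert for 2-way SSL
    "ssl.key.location": "/etc/kafka/client-key.pem",         # client private key
}
```

The same dict works for both producers and consumers in confluent-kafka-python; omitting the certificate/key entries gives one-way (server-only) TLS.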
For these reasons and more, we took matters into our own hands. In the following example we show how to batch receive messages using a BatchListener. To keep application logging configuration simple, we will be doing Spring Boot configuration and streaming log4j logs to Apache Kafka. Modern Kafka clients are backwards compatible with broker versions 0.10.0 or later. The original code will be reduced to a bare minimum in order to demonstrate Spring Boot's autoconfiguration. Kafka's Java client and Kafka Streams provide millisecond latency out of the box, which makes them great for building data pipelines with multiple microservices that consume from Kafka and produce to other Kafka topics. In this scenario, Client 3 is a simple streams application that reads data from one or more Kafka topics, performs some processing, and writes output to another Kafka topic, all in real time. We did this using Kotlin without problem, and actually benefited from a couple of nice features of the language. Here, we have included the top frequently asked questions with answers to help freshers and the experienced. This plugin uses the Kafka client 2.x.
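Batch receiving can be sketched independently of any framework: group incoming records into fixed-size chunks and hand each chunk to the listener at once. A minimal Python sketch (the helper name is hypothetical):

```python
def batches(records, size):
    """Split records into consecutive batches of at most `size` items,
    mimicking what a batch listener receives per callback."""
    return [records[i:i + size] for i in range(0, len(records), size)]
```

Processing a whole batch per callback amortizes per-message overhead (offset commits, database round trips) at the cost of coarser failure handling.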
This example demonstrates how the consumer can be used to leverage Kafka's group management functionality for automatic consumer load balancing and failover. Adding more processes/threads will cause Kafka to re-balance. This integration not only allows you to talk to Azure Event Hubs without changing your Kafka applications, but also allows you to work with some of the most demanding features of Event Hubs, like Capture, Auto-Inflate, and Geo Disaster-Recovery. This allows you to use a version of the Kafka dependency compatible with your Kafka cluster. The Kafka ecosystem depends on ZooKeeper, so you need to download it and change its configuration. It's a high priority for us that client features keep pace with core Apache Kafka and components of the Confluent Platform. const { Kafka } = require('kafkajs') // Create the client with the broker list const kafka = new Kafka({ clientId: 'my-app', brokers: ['kafka1:9092', 'kafka2:9092'] }). It supports the Apache Kafka 0.9+ wire format protocol. To construct a client, you need to configure a :region and :credentials. A client library has several methods that can be used, in this case, to communicate with Apache Kafka. In an existing application, change the regular Kafka client dependency and replace it with the Pulsar Kafka wrapper. One thing to keep in mind when producing data is what write guarantee you want to achieve. Simply download Kafka from the Apache Kafka website to the client; it includes kafka-console-producer and kafka-console-consumer in the bin directory. High-level consumer: decide if you want to read messages and events from the `Events()` channel (set `"go.events.channel.enable": true`) or by calling `Poll()`.
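Group management boils down to spreading partitions over the group's members, and recomputing the assignment (a rebalance) when members join or leave. A simplified round-robin sketch in Python; this is an illustration of the idea, not the actual coordinator protocol:

```python
def assign_partitions(partitions, consumers):
    """Round-robin assignment of topic partitions across consumer group members.
    Real Kafka uses pluggable assignors (range, round-robin, sticky), but the
    invariant is the same: each partition goes to exactly one group member."""
    assignment = {c: [] for c in consumers}
    for i, partition in enumerate(sorted(partitions)):
        member = consumers[i % len(consumers)]
        assignment[member].append(partition)
    return assignment
```

Re-running the function with one consumer removed shows why adding or losing a process forces a rebalance: every remaining member's partition set changes.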
OutputFields is the list of the fields that will be emitted in the tuple. This is a pet project for me and is not currently backed by a need for a Kafka server client. As we saw in the Kafka Consumer tutorial, if you are using a build tool like SBT or Maven, it's just a matter of adding the library as a dependency. Learn how to directly connect to Kafka on HDInsight through an Azure Virtual Network. It is a wrapper on top of Confluent's Kafka library. Later, you will send records asynchronously. A new version of Apache Kafka just got released, so it is a good time to review the basics of using Kafka. Spark is a different animal. There are many Kafka clients for C#; a list of some recommended options can be found here. For example, the ambient lighting in a room may be used to adjust the brightness of an LED bulb. Kafka Consumer. This section gives a high-level overview of how the consumer works, an introduction to the configuration settings for tuning, and some examples from each client library. Generate a keystore with keytool, for example keytool -genkey -keystore followed by your keystore file name. Apache Kafka has some built-in client tools to produce and consume messages against an Apache Kafka broker. In the last two tutorials, we created a simple Java example with a Kafka producer and a consumer. We start by adding headers using either Message or ProducerRecord.
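Record headers are small key/byte-value pairs carried alongside the payload. A Python sketch of building them in the shape kafka-python's producer accepts as its headers argument (the with_headers helper is hypothetical):

```python
def with_headers(value, **headers):
    """Pair a record value with string headers encoded as (key, bytes) tuples,
    the header representation the Kafka protocol carries."""
    return {
        "value": value,
        "headers": [(k, v.encode("utf-8")) for k, v in headers.items()],
    }
```

Headers are a good home for metadata such as trace IDs or schema versions, since they travel with the record without touching the serialized payload.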
Step by step: Kafka Pub/Sub with Docker and .Net Core, by Carlos Mendible, 08 May 2017 (dotNet, dotNetCore). Last week I attended a Kafka workshop, and this is my attempt to show you a simple step-by-step setup. kafka-python runs on Python 2.7+ and Python 3. The project also offers a balanced consumer implementation. Note that this by no means is a problem of the Kafka-based message queue implementation alone; it is a more general remark on how message processing pipelines should work. This tutorial will present an example of streaming Kafka from Spark. Download the Kafka binaries from the Kafka download page and unzip the tar file by executing tar -xzf on the downloaded kafka_2.x archive. You will send records with the Kafka producer. Pulsar provides an easy option for applications that are currently written using the Apache Kafka Java client API. Let us assume a single-partition topic with a single consumer, and that the last call to poll() returned messages with offsets 4, 5, and 6.
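Continuing that offsets example: the committed offset is the position of the next record to read, so after processing offsets 4, 5, and 6 the consumer should commit 7. A one-line Python sketch of that rule:

```python
def next_commit_offset(processed_offsets):
    """The offset to commit is one past the highest offset processed;
    on restart, consumption resumes from exactly this position."""
    return max(processed_offsets) + 1
```

Committing 6 instead of 7 would cause the last processed record to be redelivered after a restart.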
Kafka Tutorial: Writing a Kafka Consumer in Java. Developing with the Go client for Apache Kafka: https://github.com/stealthly/go_kafka_client. Use kafka-console-consumer.sh (see the Kafka quick start guide) to monitor the messages published to your topic 'my-kafka-topic'. Kafunk - F# Kafka client example. These endpoints are very helpful for getting information about applications, such as whether they are up and the health of their components. Using storm-kafka-client with different versions of Kafka. A Kafka client that consumes records from a Kafka cluster. MQTT is the protocol optimized for sensor networks and M2M. Kafdrop provides a lot of the same functionality that the Kafka command line tools offer, but in a more convenient and human-friendly web front end. Since Kafka 0.10.2, newer clients can communicate with older brokers.
Hello, we have a requirement to run redundant Kafka consumers that consume an identical stream of messages in identical order. For example, you specify the trust store location in the property kafka. As and when I'm ready to deploy the code to a 'real' execution environment (for example EMR), I can start to worry about that. For more complex networking, this might be an IP address associated with a given network interface on a machine. Confluent's .NET Client for Apache Kafka is an open source library that allows developers to send (produce) and receive (consume) messages to and from an event streaming cluster using the Apache Kafka protocol (such as Event Hubs). Once these three lines have been added, you can start a simple Kafka consumer by running kafka-console-consumer.sh. Spring Kafka brings the simple and typical Spring template programming model, with a KafkaTemplate and message-driven POJOs. Kafka Producer/Consumer Example in Scala.
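Because Kafka delivers each message to only one consumer per consumer group, redundant consumers that must each see the full stream need distinct group.id values (and identical ordering across them additionally requires reading from a single partition, since Kafka orders messages only within a partition). A Python sketch of generating such configs; the helper and prefix are hypothetical:

```python
def redundant_consumer_configs(base_config, group_prefix, copies):
    """Produce one config per redundant consumer, each with its own group.id,
    so every consumer independently receives the entire topic."""
    return [
        dict(base_config, **{"group.id": f"{group_prefix}-{i}"})
        for i in range(copies)
    ]
```

Each returned dict can then construct its own consumer; two consumers sharing a group.id would instead split the partitions between them.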
Apache Kafka's communication protocol is TCP-based. The Kafka Handler uses these properties to resolve the host and port of the Kafka brokers, and properties in the Kafka producer configuration file control the behavior of the interaction between the Kafka producer client and the Kafka brokers. Solution setup. In this way, it is similar to products like ActiveMQ, RabbitMQ, and IBM's MQSeries. The methods should be used when you, for example, connect to the Kafka broker (using the given parameters, host name for example) or when you publish a message to a topic. Basic examples are repeated for every client in a standardized way. As messages arrive, the handler will be called with the records. If you still use the old consumer implementation, replace --bootstrap-server with --zookeeper. The project is a fork of ExactTarget's Kafka-net client. Spring Kafka Consumer/Producer Example (10 minute read): in this post, you're going to learn how to create a Spring Kafka Hello World example that uses Spring Boot and Maven. These programs are written in a style and a scale that will allow you to adapt them. You can also configure a default region (for example, via the AWS_REGION environment variable). The brokers on the list are considered seed brokers and are only used to bootstrap the client and load initial metadata.
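Seed brokers only bootstrap the client: it connects to any one of them, fetches cluster metadata, and from then on talks to whichever brokers lead the partitions it needs. A small Python sketch of building the bootstrap list (the property name follows the common bootstrap.servers convention; the helper itself is illustrative):

```python
def bootstrap_servers(seed_brokers):
    """Join seed broker addresses into the comma-separated bootstrap.servers
    string Kafka clients expect; the full cluster is discovered via metadata."""
    return {"bootstrap.servers": ",".join(seed_brokers)}
```

Listing two or three seeds is usually enough: the client only needs one of them reachable to learn about the rest of the cluster.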
It provides an implementation that covers most basic functionality, including a simple Producer and Consumer. This tool allows you to list, describe, or delete consumer groups. It provides simple parallelism, 1:1 correspondence between Kafka partitions and Spark partitions, and access to offsets and metadata. In this tutorial series, we will discuss how to stream log4j application logs to Apache Kafka using the Maven artifact kafka-log4j-appender. Pykafka was the only Python client to implement this feature. C# client for the Apache Kafka bus (0.8).
This means the client is currently only tested against a small set of Kafka test servers, and not against any server with a real data load. Kafka Producer and Consumer Examples Using Java: in this article, a software engineer shows how to produce and consume records/messages with Kafka brokers.