Apache Kafka is an open-source, distributed event streaming platform used by many organizations for high-performance data pipelines, streaming analytics, and data integration. Kafka can process data streams as they occur and persist them in fault-tolerant storage. When Kafka is paired with SAS Event Stream Processing (ESP), it creates a powerhouse for real-time decision-making. SAS ESP includes a Kafka connector for streaming data to and from Kafka; the connector communicates with a Kafka broker to perform publish and subscribe operations.
This post covers the essential steps for integrating SAS Event Stream Processing (ESP) with Apache Kafka, focusing on the connector configurations required for seamless data flow.
Prerequisites:
Before configuring and using SAS Event Stream Processing (ESP) with Kafka, ensure the following requirements are met.
The Kafka Connector is built directly into the SAS ESP engine. It is the go-to choice for high-performance, low-latency requirements where the connection is managed as part of the project model.
To connect to Kafka, you need the Kafka broker information, including the hostname and port number.
The following images show a Kafka instance and its broker information.
The subscriber connector of a source window in a SAS ESP project sends the window's event data to Kafka. To establish a connection with the Kafka instance, the following parameters are required on the subscriber connector settings page.
Example:
kafkahostport: kafka-broker-1.kafka.svc.cluster.local:28092
kafkatopic: sasesm
kafkatype: csv
urlhostport: unused:33333
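In an ESP project's XML model, these same settings map to a Kafka connector defined in subscribe mode on the source window. A minimal sketch is shown below; the connector name is illustrative, and the property names follow the ESP Kafka connector conventions (check your release's documentation for the exact required set):

```xml
<connectors>
  <!-- Subscribe mode: events processed by this window are written out to Kafka -->
  <connector class='kafka' type='subscribe' name='kafkaSubOut'>
    <properties>
      <property name='kafkahostport'>kafka-broker-1.kafka.svc.cluster.local:28092</property>
      <property name='kafkatopic'>sasesm</property>
      <property name='kafkatype'>csv</property>
      <!-- Required by the connector, but not used for broker communication -->
      <property name='urlhostport'>unused:33333</property>
    </properties>
  </connector>
</connectors>
```

Note that in ESP terminology the connector `type` refers to its direction relative to the window: `subscribe` means the connector subscribes to the window's events and forwards them to Kafka.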
The following images illustrate the configuration of the subscriber connector in a source window linked to a Kafka instance.
When a SAS ESP project whose source window is configured with both an input data connector and a Kafka subscriber connector is executed, the project automatically streams and writes its data to the designated Kafka topics.
The following images illustrate the active SAS ESP project, featuring a source window configured with both an input data connector and a Kafka subscriber connector.
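The arrangement shown in the images can be sketched as a source window carrying both connectors. The schema, file name, and connector names below are illustrative assumptions, not taken from the project in the screenshots:

```xml
<window-source name='src' insert-only='true'>
  <schema>
    <fields>
      <field name='id' type='int64' key='true'/>
      <field name='value' type='double'/>
    </fields>
  </schema>
  <connectors>
    <!-- Publish mode: reads input events from a CSV file into the window -->
    <connector class='fs' type='publish' name='fileIn'>
      <properties>
        <property name='fstype'>csv</property>
        <property name='fsname'>input.csv</property>
      </properties>
    </connector>
    <!-- Subscribe mode: forwards the window's events to the Kafka topic -->
    <connector class='kafka' type='subscribe' name='kafkaOut'>
      <properties>
        <property name='kafkahostport'>kafka-broker-1.kafka.svc.cluster.local:28092</property>
        <property name='kafkatopic'>sasesm</property>
        <property name='kafkatype'>csv</property>
        <property name='urlhostport'>unused:33333</property>
      </properties>
    </connector>
  </connectors>
</window-source>
```

Because both connectors are attached to the same window, every event injected by the file connector flows through the window and is delivered to Kafka by the subscriber connector.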
The screenshots below illustrate the specific Kafka topics used as storage targets for the streaming data generated by the SAS ESP project.
Similarly, the publisher connector of a source window in a SAS ESP project transmits data in the opposite direction, from Kafka storage into the ESP source window: it enables the source window to ingest events from the Kafka topic.
The following images illustrate the configuration of the publisher connector in a source window linked to a Kafka instance.
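In the XML model, the publisher side differs mainly in the connector `type`, which becomes `publish` (the connector publishes events from Kafka into the window). A minimal sketch, reusing the same connection parameters as the subscriber example; the connector name is illustrative and the exact required property set may vary by ESP release:

```xml
<connectors>
  <!-- Publish mode: events are read from the Kafka topic into the source window -->
  <connector class='kafka' type='publish' name='kafkaIn'>
    <properties>
      <property name='kafkahostport'>kafka-broker-1.kafka.svc.cluster.local:28092</property>
      <property name='kafkatopic'>sasesm</property>
      <property name='kafkatype'>csv</property>
      <property name='urlhostport'>unused:33333</property>
    </properties>
  </connector>
</connectors>
```

With `kafkatype` set to `csv`, the connector parses each Kafka message as CSV and maps the fields to the source window's schema, so the message layout must match the window's field order.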
The following images illustrate the active SAS ESP project, featuring a source window configured with an input data connector (publisher) to Kafka.
Important Links: Using the Kafka Connector and Adapter