Keep your data secure with the Confluent Platform

The Confluent Platform is the central nervous system for a business, uniting an organisation around a single, Apache Kafka-based source of truth. Because it stores mission-critical data, enabling its security features is crucial.

More customers in the finance, healthcare, government and education sectors are adopting the Confluent Platform for their mission-critical data, so security matters more than ever. In response, Confluent has updated its platform with a range of features designed to solve challenging security problems.

In this blog, we discuss three pertinent security features: secure stream processing, Secret Protection and AD/LDAP group support.

Secure stream processing

If you are using Apache Kafka and the Streams API for your data infrastructure, you have the following security options:

  • Data Encryption.
  • Client Authorisation.
  • Client Authentication.

These features help you implement security policies in a highly regulated, sensitive environment.

Additionally, the Streams API supports these client-side security features through native integration with Kafka's security layer, so a stream processing application built with the Streams API can take advantage of them directly.

Combined with Kafka's security features, the Streams API lets you implement robust data infrastructures with optional capabilities such as:

  • Encrypting data-in-transit between brokers, securing broker-to-broker communication.
  • Encrypting data-in-transit between brokers and clients.
  • Authenticating clients on connections between clients and brokers.
  • Authorising read/write operations performed by clients.

Kafka supports cluster encryption and authentication, and is flexible enough that you can tailor these security features to your needs.
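These options map onto standard Kafka client configuration parameters, which a Streams application picks up like any other client. A minimal sketch for an SSL-secured cluster might look like this (host names, file paths and passwords are placeholders):

```properties
# Hypothetical Streams application config; values are placeholders.
application.id=my-streams-app
bootstrap.servers=broker1.example.com:9093

# Encrypt data-in-transit between the application and the brokers
security.protocol=SSL
ssl.truststore.location=/etc/kafka/client.truststore.jks
ssl.truststore.password=<truststore-password>

# Enable client authentication (mutual TLS) towards the brokers
ssl.keystore.location=/etc/kafka/client.keystore.jks
ssl.keystore.password=<keystore-password>
ssl.key.password=<key-password>
```

With these settings in place, read/write authorisation is then enforced on the broker side through ACLs granted to the client's principal.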

Secret Protection

Secret Protection is an additional security measure; a secret is data such as a password or other sensitive information. Secret Protection leverages envelope encryption to encrypt secrets within configuration files. You derive a master encryption key from a master passphrase and a cryptographic salt value; a separate data encryption key is then generated to encrypt the secrets themselves.

The master encryption key encrypts the data encryption key before it is stored in a secure file, and the data encryption key in turn encrypts the secrets in the configuration files. Only encrypted secrets are visible, so even if someone gains access to the files, decryption cannot take place without the master encryption key.
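The envelope-encryption idea can be illustrated with a toy sketch (this is not Confluent's actual implementation, and the HMAC-based stream cipher below is for illustration only):

```python
import hashlib
import hmac
import secrets

def derive_master_key(passphrase: bytes, salt: bytes) -> bytes:
    # The master key is derived from a passphrase and a cryptographic salt
    return hashlib.pbkdf2_hmac("sha256", passphrase, salt, 100_000)

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    # Toy stream cipher: an HMAC-derived keystream XORed with the data.
    # Illustration only -- use a vetted AEAD cipher in real systems.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hmac.new(key, nonce + counter.to_bytes(4, "big"),
                         hashlib.sha256).digest()
        out.extend(block)
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

# 1. Derive the master encryption key from a passphrase and salt
master_key = derive_master_key(b"my-passphrase", salt=b"random-salt")

# 2. Generate a data encryption key (DEK) and "wrap" it under the
#    master key -- the wrapped DEK is what lands in the secrets file
dek = secrets.token_bytes(32)
dek_nonce = secrets.token_bytes(16)
wrapped_dek = keystream_xor(master_key, dek_nonce, dek)

# 3. Encrypt a configuration secret under the DEK
secret = b"ssl.keystore.password=hunter2"
secret_nonce = secrets.token_bytes(16)
ciphertext = keystream_xor(dek, secret_nonce, secret)

# Decryption must unwrap the DEK with the master key first
recovered_dek = keystream_xor(master_key, dek_nonce, wrapped_dek)
plaintext = keystream_xor(recovered_dek, secret_nonce, ciphertext)
assert plaintext == secret
```

The key property is the layering: losing the configuration and secrets files reveals only ciphertext, because the data encryption key itself is stored encrypted under a master key that lives elsewhere.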

To set up Secret Protection, download the Confluent Platform and install the Confluent CLI. Then, follow the Secret Protection tutorial, which demonstrates how to encrypt a basic configuration parameter.

Begin by generating the master encryption key: choose a passphrase and supply it to the CLI, then choose a local host location for your secrets file, which holds the encrypted data encryption key and the encrypted configuration parameters. From there, you can generate the master encryption key, then encrypt and update the value of a configuration parameter.
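With the Confluent CLI, these steps can be sketched roughly as follows (paths, the passphrase file and the parameter name are placeholders; check the Secret Protection tutorial for the exact flags on your CLI version):

```shell
# Generate the master encryption key from a passphrase and store
# the derived material in a local secrets file
confluent secret master-key generate \
  --local-secrets-file /usr/secrets/security.properties \
  --passphrase @/path/to/passphrase.txt

# Encrypt a configuration parameter in place
confluent secret file encrypt \
  --config-file /etc/kafka/server.properties \
  --local-secrets-file /usr/secrets/security.properties \
  --remote-secrets-file /usr/secrets/security.properties \
  --config "ssl.keystore.password"
```

After this, the configuration file contains only an encrypted placeholder for the parameter rather than the plaintext value.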

Enable Secret Protection on the destination hosts by exporting the master encryption key into the environment on every host whose configuration files use Secret Protection. Then distribute the secrets file by copying it from the local host to each destination host. Finally, restart the services that read the protected configuration files.
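On each destination host, that flow looks something like the following (the environment variable name follows the Secret Protection docs; the paths and service name are placeholders):

```shell
# Make the master encryption key available to services on this host
export CONFLUENT_SECURITY_MASTER_KEY=<master-encryption-key>

# Distribute the secrets file from the local host (paths are placeholders)
scp /usr/secrets/security.properties user@destination:/usr/secrets/security.properties

# Restart the service that reads the protected configuration file
systemctl restart confluent-server
```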

AD/LDAP group support

Many enterprises standardise on AD for identity-related services. In the latest release, Confluent included access control configuration for AD/LDAP directories.

With the latest release, you can:

  • Leverage central and standard directory services such as Active Directory (AD) and cross-platform protocol Lightweight Directory Access Protocol (LDAP).
  • Manage access control lists (ACLs).
  • Manage and monitor security configurations and address the gaps faster.

In short, you can now use AD and LDAP groups when configuring access controls, which simplifies access control management.

Some of the other benefits include:

  • Management of group ACLs with existing tools and APIs.
  • Ability to limit the groups returned from the LDAP server using a search filter configuration parameter.
  • Deny access to any of the groups that a principal belongs to with the DENY rule.
  • Periodic updating of users and groups, so information is up to date with the directory server.
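As a rough sketch, group-based authorisation is configured on the broker along these lines (the property names follow the Confluent LDAP authorizer docs for CP 5.x and may differ in your version; hosts and DNs are placeholders):

```properties
# Hypothetical broker configuration for the Confluent LDAP authorizer;
# all values are placeholders.
authorizer.class.name=io.confluent.kafka.security.ldap.authorizer.LdapAuthorizer
ldap.authorizer.java.naming.provider.url=ldap://ad.example.com:389
ldap.authorizer.search.mode=GROUPS
ldap.authorizer.group.search.base=ou=groups,dc=example,dc=com
ldap.authorizer.group.name.attribute=cn
ldap.authorizer.group.member.attribute=member
# Periodically refresh users and groups from the directory server
ldap.authorizer.refresh.interval.ms=60000
```

Group ACLs are then managed with the existing tooling, for example by granting a principal such as `Group:finance` read access to a topic via the standard `kafka-acls` command.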

Security should be a top priority for enterprises, and the updates to the Confluent Platform can alleviate the burden of security management. 


The Confluent Platform is an enterprise-grade event streaming platform that provides mission-critical reliability at enterprise scale, with sub-millisecond latency. It secures event streaming with encryption, authentication and authorisation, and connects all your apps and data through the broadest ecosystem of clients and connectors for Kafka. With Kafka Streams and KSQL, Confluent's streaming SQL engine, you can process and respond to data in real time, while monitoring and managing the health of your clusters and data streams. The Confluent Platform also gives you the freedom to deploy on any cloud, public or private, and to stream across on-premises and public clouds with the industry's only hybrid streaming service.

LimePoint as a Confluent Partner

Any organisation is, in essence, a stream of events: events that have happened in the past, are happening now, and will continue to happen. An event streaming platform is therefore crucial to running your business with enhanced agility. LimePoint provides a best-in-class set of services for event-driven streaming, from implementation to deployment to managed services, supporting you with a high degree of effectiveness and efficiency.

As a Confluent partner, we help our customers harness the volume of constantly changing data by enabling the Confluent Platform, based on Apache Kafka. We harness the event streaming platform so your organisation can maximise the value of its data and power real-time business. Regardless of your size or industry, we can help you buy, build, implement, service, support and run the Confluent Platform.

Please visit our API and Integrations page to learn more about our services.
