1. Introduction

In this article, we’ll learn how to implement the SASL/PLAIN authentication mechanism in a Kafka service. We’ll also implement client-side authentication using the support provided by Spring Kafka.

Kafka supports multiple authentication options, including SASL, SSL, and delegation token authentication, which provide enhanced security and compatibility.

Simple Authentication and Security Layer (SASL) is an authentication framework that allows mechanisms such as GSSAPI, OAuthBearer, SCRAM, and PLAIN to be integrated easily.

On its own, SASL/PLAIN authentication is not secure, because user credentials are sent over the network in plaintext. However, it’s still useful for local development, since it requires very little configuration.

We should note that SASL/PLAIN should not be used in production environments unless it’s combined with SSL/TLS. This combination, referred to as SASL_SSL in Kafka, encrypts the traffic between the client and the server, including the sensitive credentials.
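As an illustration only, a production-style broker configuration combining PLAIN with TLS could look roughly like the following sketch; the listener address, keystore paths, and passwords are placeholders rather than values from this article’s setup:

listeners=SASL_SSL://0.0.0.0:9093
advertised.listeners=SASL_SSL://kafka.example.com:9093
security.inter.broker.protocol=SASL_SSL
sasl.enabled.mechanisms=PLAIN
sasl.mechanism.inter.broker.protocol=PLAIN
# placeholder TLS material: supply real keystores/truststores in practice
ssl.keystore.location=/etc/kafka/secrets/kafka.keystore.jks
ssl.keystore.password=changeit
ssl.truststore.location=/etc/kafka/secrets/kafka.truststore.jks
ssl.truststore.password=changeit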

2. Implement Kafka With SASL/PLAIN Authentication

Let’s imagine we need to build a Kafka service that supports SASL/PLAIN authentication in a Docker environment.
For that, we’ll use a JAAS configuration to add the user credentials required by the SASL/PLAIN mechanism.

2.1. Configure Kafka Credentials

To configure user credentials in Kafka, we’ll use the PlainLoginModule security implementation.

Let’s include a kafka_server_jaas.conf file to configure the admin and user1 credentials:

KafkaServer {
  org.apache.kafka.common.security.plain.PlainLoginModule required
    username="admin"
    password="admin-secret"
    user_admin="admin-secret"
    user_user1="user1-secret";
};

In the above code, we define the admin and user1 users, used for Kafka’s inter-broker authentication and for external client authentication, respectively. Each user is declared as an entry in the format user_<username>="<secret>", which is how user1 gets its credentials on the user_user1 line.
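For example, if we later needed a second application user, we’d only add another user_<username> entry in the same section; the user2 name and secret below are purely illustrative:

KafkaServer {
  org.apache.kafka.common.security.plain.PlainLoginModule required
    username="admin"
    password="admin-secret"
    user_admin="admin-secret"
    user_user1="user1-secret"
    user_user2="user2-secret";
};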

2.2. Configure Zookeeper Credentials

Now that we’ve included the client user credentials in the Kafka service, it’s good practice to secure the Zookeeper service as well, so we’ll also protect it with SASL authentication.

Let’s include a zookeeper_jaas.conf file to configure the zookeeper user credentials:

Server {
  org.apache.zookeeper.server.auth.DigestLoginModule required
    username="zookeeper"
    password="zookeeper-secret"
    user_zookeeper="zookeeper-secret";
};

In the above configuration, we’re using DigestLoginModule, Zookeeper’s own security implementation, instead of Kafka’s PlainLoginModule, since Zookeeper handles SASL authentication through its digest-based mechanism.

Additionally, we’ll include the zookeeper credentials in the previously created kafka_server_jaas.conf file:

Client {
  org.apache.kafka.common.security.plain.PlainLoginModule required
    username="zookeeper"
    password="zookeeper-secret";
};

The above Client credentials are used by the Kafka service to authenticate with the Zookeeper service.

2.3. Set Up Kafka Service With Zookeeper

We can set up our Kafka and Zookeeper services using a Docker Compose file.

First, we’ll implement a Zookeeper service and include the zookeeper_jaas.conf file:

services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.6.6
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000
      KAFKA_OPTS: "-Djava.security.auth.login.config=/etc/kafka/zookeeper_jaas.conf"
    volumes:
      - ./config/zookeeper_jaas.conf:/etc/kafka/zookeeper_jaas.conf
    ports:
      - 2181

Next, we’ll implement a Kafka service with SASL/PLAIN authentication:

kafka:
  image: confluentinc/cp-kafka:7.6.6
  depends_on:
    - zookeeper
  environment:
    KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
    KAFKA_LISTENERS: SASL_PLAINTEXT://0.0.0.0:9092
    KAFKA_ADVERTISED_LISTENERS: SASL_PLAINTEXT://localhost:9092
    KAFKA_INTER_BROKER_LISTENER_NAME: SASL_PLAINTEXT
    KAFKA_SASL_ENABLED_MECHANISMS: PLAIN
    KAFKA_SASL_MECHANISM_INTER_BROKER_PROTOCOL: PLAIN
    KAFKA_OPTS: "-Djava.security.auth.login.config=/etc/kafka/kafka_server_jaas.conf"
    KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
  volumes:
    - ./config/kafka_server_jaas.conf:/etc/kafka/kafka_server_jaas.conf
  ports:
    - "9092:9092"

In the above code, we’ve included the previously created kafka_server_jaas.conf file to set up the SASL/PLAIN users.

We should note that the KAFKA_ADVERTISED_LISTENERS property defines the endpoint that Kafka clients use to connect to the broker for sending and receiving messages.

Finally, we’ll run the entire Docker setup using the docker compose command:

docker compose up --build

We’ll see logs similar to the following in the Docker console:

kafka-1      | [2025-06-19 14:32:00,441] INFO Session establishment complete on server zookeeper/172.18.0.2:2181, session id = 0x10000004c150001, negotiated timeout = 18000 (org.apache.zookeeper.ClientCnxn)
kafka-1      | [2025-06-19 14:32:00,445] INFO [ZooKeeperClient Kafka server] Connected. (kafka.zookeeper.ZooKeeperClient)
zookeeper-1  | [2025-06-19 14:32:00,461] INFO Successfully authenticated client: authenticationID=zookeeper;  authorizationID=zookeeper. (org.apache.zookeeper.server.auth.SaslServerCallbackHandler)

These logs confirm that the Kafka and Zookeeper services are integrated without any errors.
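Optionally, before wiring up a Spring client, we can sanity-check the authentication from the command line. As a minimal sketch, we could create a small client properties file, mirroring the user1 credentials defined earlier:

security.protocol=SASL_PLAINTEXT
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="user1" password="user1-secret";

Assuming the file is made available inside the Kafka container, for instance at /tmp/client.properties via an extra volume mount, we could then point the console producer at it:

docker compose exec kafka kafka-console-producer \
  --bootstrap-server localhost:9092 \
  --topic test-topic \
  --producer.config /tmp/client.properties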

3. Implement Kafka Client With Spring

We’ll implement the producer and consumer services using Spring Kafka.

3.1. Maven Dependencies

First, we’ll include the Spring Kafka dependency:

<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
    <version>3.1.2</version>
</dependency>

Next, we’ll implement a producer service to send messages.

3.2. Kafka Producer

Let’s implement a Kafka producer service using the KafkaTemplate class:

public void sendMessage(String message, String topic) {
    LOGGER.info("Producing message: {}", message);
    kafkaTemplate.send(topic, "key", message)
        .whenComplete((result, ex) -> {
            if (ex == null) {
                LOGGER.info("Message sent to topic: {}", message);
            } else {
                LOGGER.error("Failed to send message", ex);
            }
        });
}

In the above code, we’re sending a message using the send method of KafkaTemplate.
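For context, here’s a rough sketch of the producer service we assume wraps this method; the class name matches the logs shown later, but the field and constructor are our own scaffolding, and Spring Boot auto-configures the injected KafkaTemplate when spring-kafka is on the classpath:

@Service
public class KafkaProducer {

    private static final Logger LOGGER = LoggerFactory.getLogger(KafkaProducer.class);

    private final KafkaTemplate<String, String> kafkaTemplate;

    public KafkaProducer(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    // the sendMessage(String message, String topic) method from the listing above lives here
}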

3.3. Kafka Consumer

We’ll use Spring Kafka’s @KafkaListener annotation and Kafka’s ConsumerRecord class to implement the consumer service.

Let’s implement a consumer method with the @KafkaListener annotation:

@KafkaListener(topics = TOPIC)
public void receive(ConsumerRecord<String, String> consumerRecord) {
    LOGGER.info("Received payload: '{}'", consumerRecord.toString());
    messages.add(consumerRecord.value());
}

In the above code, we receive a message and add it to the messages list.
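Similarly, here’s a minimal sketch of the consumer service we assume around this listener; the TOPIC constant and the messages collection are our own scaffolding, chosen to match the test in section 4:

@Service
public class KafkaConsumer {

    private static final Logger LOGGER = LoggerFactory.getLogger(KafkaConsumer.class);
    private static final String TOPIC = "test-topic";

    // received values are collected here and asserted on in the integration test
    final List<String> messages = new CopyOnWriteArrayList<>();

    // the receive(ConsumerRecord<String, String>) method from the listing above lives here
}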

3.4. Configure Spring Application With Kafka

Next, we’ll create an application.yml file and include a few Spring Kafka-related properties:

spring:
  kafka:
    bootstrap-servers: localhost:9092
    consumer:
      group-id: test-group
      auto-offset-reset: earliest
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.apache.kafka.common.serialization.StringDeserializer

Now, let’s run the application and verify the setup:

kafka-1 | [2025-06-19 14:38:33,188] INFO [SocketServer listenerType=ZK_BROKER, nodeId=1001] Failed authentication with /192.168.65.1 (Unexpected Kafka request of type METADATA during SASL handshake.) (org.apache.kafka.common.network.Selector)

As expected, the client application is unable to authenticate with the Kafka server.

3.5. Configure Client With JAAS Config

To resolve the above error, we’ll use the spring.kafka.properties configuration to provide the SASL/PLAIN settings.

Now, we’ll include a few additional configurations related to the user1 credentials and set the sasl.mechanism property to PLAIN:

spring:
  kafka:
    bootstrap-servers: localhost:9092
    properties:
      sasl.mechanism: PLAIN
      sasl.jaas.config: >
        org.apache.kafka.common.security.plain.PlainLoginModule required
        username="user1"
        password="user1-secret";
    security:
      protocol: "SASL_PLAINTEXT"

In the above code, we’ve included the matching username and password as part of the sasl.jaas.config property.
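If we’d rather keep these settings out of application.yml, the same values can also be supplied programmatically. Below is a hedged sketch that builds a producer factory using Kafka’s configuration constants; the class and bean names are our own, and a consumer factory would be configured in the same way:

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.CommonClientConfigs;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.config.SaslConfigs;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

@Configuration
public class KafkaSaslClientConfig {

    private static final String PLAIN_JAAS_CONFIG =
      "org.apache.kafka.common.security.plain.PlainLoginModule required "
        + "username=\"user1\" password=\"user1-secret\";";

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(CommonClientConfigs.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SASL_PLAINTEXT");
        props.put(SaslConfigs.SASL_MECHANISM, "PLAIN");
        props.put(SaslConfigs.SASL_JAAS_CONFIG, PLAIN_JAAS_CONFIG);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return new DefaultKafkaProducerFactory<>(props);
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate(ProducerFactory<String, String> producerFactory) {
        return new KafkaTemplate<>(producerFactory);
    }
}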

Sometimes, we can run into errors due to missing or incorrect SASL configuration. For example, we’ll get the following error if the sasl.mechanism property is set to PLAINTEXT instead of PLAIN:

Caused by: org.apache.kafka.common.KafkaException: org.apache.kafka.common.errors.SaslAuthenticationException: Failed to configure SaslClientAuthenticator
	... 25 common frames omitted
Caused by: org.apache.kafka.common.errors.SaslAuthenticationException: Failed to create SaslClient with mechanism PLAINTEXT

We’ll get a different error when the sasl.mechanism property is incorrectly named as security.mechanism:

Caused by: java.lang.IllegalArgumentException: No serviceName defined in either JAAS or Kafka config

The latter error occurs because, without an explicit sasl.mechanism, the client falls back to the default GSSAPI mechanism, which expects a Kerberos service name.

With the configuration in place, let’s verify the Kafka application with the entire setup.

4. Testing

We’ll use the Testcontainers framework to test the Kafka client application.
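Along with the Spring Kafka dependency from earlier, the test below assumes the Testcontainers JUnit 5 support and Awaitility are on the test classpath; the versions shown are only indicative:

<dependency>
    <groupId>org.testcontainers</groupId>
    <artifactId>junit-jupiter</artifactId>
    <version>1.19.3</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.awaitility</groupId>
    <artifactId>awaitility</artifactId>
    <version>4.2.0</version>
    <scope>test</scope>
</dependency>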

First, we’ll create a DockerComposeContainer object using the docker-compose.yml:

@Container
public DockerComposeContainer<?> container =
  new DockerComposeContainer<>(new File("src/test/resources/sasl-plaintext/docker-compose.yml"))
    .withExposedService("kafka", 9092, Wait.forListeningPort());

Next, let’s implement a test method to validate the consumer:

@Test
void givenSaslIsConfigured_whenProducerSendsMessageOverSasl_thenConsumerReceivesOverSasl() {
    String message = UUID.randomUUID().toString();
    kafkaProducer.sendMessage(message, "test-topic");

    await().atMost(Duration.ofMinutes(2))
      .untilAsserted(() -> assertThat(kafkaConsumer.messages).containsExactly(message));
}

Finally, we’ll run the test case and verify the output:

16:56:44.525 [kafka-producer-network-thread | producer-1] INFO c.b.saslplaintext.KafkaProducer - Message sent to topic: 82e8a804-0269-40a2-b8ed-c509e6951011
16:56:48.566 INFO  c.b.saslplaintext.KafkaConsumer - Received payload: ConsumerRecord(topic = test-topic, ... key = key, value = 82e8a804-0269-40a2-b8ed-c509e6951011

From the above logs, we can see that the consumer service has successfully received the message.

5. Conclusion

In this tutorial, we’ve learned how to set up SASL/PLAIN authentication in a Kafka service using JAAS config in a Docker environment.

We’ve also implemented producer and consumer services, and configured client-side authentication using a similar JAAS config. Finally, we tested the entire setup by sending and receiving a message using Testcontainers with Docker Compose.

The code backing this article is available on GitHub.