1. Overview

Apache Kafka has established itself as one of the most widely used messaging systems for building event-driven architectures, where one microservice publishes a message to a topic and another consumes and processes it asynchronously.

However, in some scenarios the publishing microservice needs an immediate response before it can proceed with further processing. While Kafka is inherently designed for asynchronous communication, we can configure it to support synchronous request-reply communication through separate topics.

In this tutorial, we’ll explore how to implement synchronous request-reply communication in a Spring Boot application using Apache Kafka.

2. Setting Up the Project

For our demonstration, we’ll simulate a notification dispatch system. We’ll create a single Spring Boot application that will act as both the producer and the consumer.

2.1. Dependencies

Let’s start by adding the Spring Kafka dependency to our project’s pom.xml file:

<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
    <version>3.3.4</version>
</dependency>

This dependency provides us with the necessary classes to establish a connection and interact with the provisioned Kafka instance.

2.2. Defining Request-Reply Messages

Next, let’s define two records to represent our request and reply messages:

record NotificationDispatchRequest(String emailId, String content) {
}

record NotificationDispatchResponse(UUID notificationId) {
}

Here, the NotificationDispatchRequest record holds the emailId and content of the notification, while the NotificationDispatchResponse record contains a unique notificationId that is generated after processing the request.

2.3. Defining Kafka Topics and Configuration Properties

Now, let’s define our request and reply Kafka topics. Additionally, we’ll configure a timeout duration for receiving a reply from the consumer component.

We’ll store these properties in our project’s application.yaml file and use @ConfigurationProperties to map the values to a Java record, which our configuration and service layers can reference:

@Validated
@ConfigurationProperties(prefix = "com.baeldung.kafka.synchronous")
record SynchronousKafkaProperties(
    @NotBlank
    String requestTopic,

    @NotBlank
    String replyTopic,

    @NotNull @DurationMin(seconds = 10) @DurationMax(minutes = 2)
    Duration replyTimeout
) {
}

We’ve also added validation annotations to ensure all the required properties are configured correctly. If any of the defined validations fail, the Spring ApplicationContext will fail to start up. This allows us to conform to the fail-fast principle.

Below is a snippet of our application.yaml file, which defines the required properties that will be mapped to our SynchronousKafkaProperties record automatically:

com:
  baeldung:
    kafka:
      synchronous:
        request-topic: notification-dispatch-request
        reply-topic: notification-dispatch-response
        reply-timeout: 30s

Here, we configure our request and reply Kafka topic names along with a reply-timeout of thirty seconds.

In addition to our custom properties, let’s add a few core Kafka configuration properties to our application.yaml file as well:

spring:
  kafka:
    bootstrap-servers: ${KAFKA_BOOTSTRAP_SERVERS}
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.springframework.kafka.support.serializer.JsonSerializer
    consumer:
      group-id: synchronous-kafka-group
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.springframework.kafka.support.serializer.JsonDeserializer
      properties:
        spring:
          json:
            trusted:
              packages: com.baeldung.kafka.synchronous
    properties:
      allow:
        auto:
          create:
            topics: true

First, to allow our application to connect to the provisioned Kafka instance, we configure its bootstrap server URL using an environment variable.

Next, we configure the key and value serialization and deserialization properties for both the consumer and producer. Additionally, for our consumer, we configure a group-id and trust the package containing our request-reply records for JSON deserialization.

With the above properties in place, Spring Boot's auto-configuration creates ConsumerFactory and ProducerFactory beans for us. We'll use them to define additional Kafka configuration beans in the next section.

Lastly, we enable auto-creation of topics, so Kafka creates them automatically if they don't already exist. It's important to note that we've enabled this property only for our demonstration; it shouldn't be used in production applications.
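In a production setup, we'd typically create the topics explicitly instead. As a minimal sketch, reusing the topic names from our SynchronousKafkaProperties record, we could declare them as NewTopic beans so that Spring Boot's auto-configured KafkaAdmin creates them on startup. The bean names, partition count, and replication factor below are illustrative assumptions:

@Bean
NewTopic notificationRequestTopic() {
    return TopicBuilder.name(synchronousKafkaProperties.requestTopic())
      .partitions(1) // illustrative; size according to the expected load
      .replicas(1) // illustrative; use a higher replication factor in production
      .build();
}

@Bean
NewTopic notificationReplyTopic() {
    return TopicBuilder.name(synchronousKafkaProperties.replyTopic())
      .partitions(1)
      .replicas(1)
      .build();
}

With beans like these in place, we no longer need to rely on topic auto-creation.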

2.4. Defining Kafka Configuration Beans

With our configuration properties in place, let’s define the necessary Kafka configuration beans:

@Bean
KafkaMessageListenerContainer<String, NotificationDispatchResponse> kafkaMessageListenerContainer(
    ConsumerFactory<String, NotificationDispatchResponse> consumerFactory
) {
    String replyTopic = synchronousKafkaProperties.replyTopic();
    ContainerProperties containerProperties = new ContainerProperties(replyTopic);
    return new KafkaMessageListenerContainer<>(consumerFactory, containerProperties);
}

First, we inject the ConsumerFactory instance and use it along with the configured replyTopic to create a KafkaMessageListenerContainer bean. This container is responsible for polling messages from our reply topic.

Next, we’ll define the core bean that we’ll use in our service layer to perform synchronous communication:

@Bean
ReplyingKafkaTemplate<String, NotificationDispatchRequest, NotificationDispatchResponse> replyingKafkaTemplate(
    ProducerFactory<String, NotificationDispatchRequest> producerFactory,
    KafkaMessageListenerContainer<String, NotificationDispatchResponse> kafkaMessageListenerContainer
) {
    Duration replyTimeout = synchronousKafkaProperties.replyTimeout();
    var replyingKafkaTemplate = new ReplyingKafkaTemplate<>(producerFactory, kafkaMessageListenerContainer);
    replyingKafkaTemplate.setDefaultReplyTimeout(replyTimeout);
    return replyingKafkaTemplate;
}

Using the ProducerFactory and the earlier defined KafkaMessageListenerContainer bean, we create a ReplyingKafkaTemplate bean. Additionally, using the autowired synchronousKafkaProperties, we configure the reply-timeout that we’ve defined in our application.yaml file, which will determine how long our service will wait for a response before timing out.

This ReplyingKafkaTemplate bean manages the interactions between the request and reply topics, making synchronous communication over Kafka possible.
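As a side note that goes beyond this article's single-instance setup: if multiple application instances were to poll the same reply topic, we could mark the topic as shared, so that replies correlated to other instances are logged at debug level instead of as errors. Each instance would then also need its own consumer group so that every instance sees every reply:

// hypothetical tweak for a multi-instance deployment sharing one reply topic
replyingKafkaTemplate.setSharedReplyTopic(true);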

Lastly, let’s define beans to enable our listener component to send responses back to the reply topic:

@Bean
KafkaTemplate<String, NotificationDispatchResponse> kafkaTemplate(ProducerFactory<String, NotificationDispatchResponse> producerFactory) {
    return new KafkaTemplate<>(producerFactory);
}

@Bean
KafkaListenerContainerFactory<ConcurrentMessageListenerContainer<String, NotificationDispatchRequest>> kafkaListenerContainerFactory(
    ConsumerFactory<String, NotificationDispatchRequest> consumerFactory,
    KafkaTemplate<String, NotificationDispatchResponse> kafkaTemplate
) {
    var factory = new ConcurrentKafkaListenerContainerFactory<String, NotificationDispatchRequest>();
    factory.setConsumerFactory(consumerFactory);
    factory.setReplyTemplate(kafkaTemplate);
    return factory;
}

First, we create a standard KafkaTemplate bean using the ProducerFactory bean.

Then, we use it along with the ConsumerFactory bean to define the KafkaListenerContainerFactory bean. This bean enables our listener components that consume messages from the request topic to send a message back to the reply topic after the required processing is completed.

3. Implementing Synchronous Communication With Kafka

With our configuration in place, let’s implement a synchronous request-reply communication between our two configured Kafka topics.

3.1. Sending and Receiving Messages Using ReplyingKafkaTemplate

First, let’s create a NotificationDispatchService class that sends messages to the configured request topic using the ReplyingKafkaTemplate bean we defined earlier:

@Service
@EnableConfigurationProperties(SynchronousKafkaProperties.class)
class NotificationDispatchService {

    private final SynchronousKafkaProperties synchronousKafkaProperties;
    private final ReplyingKafkaTemplate<String, NotificationDispatchRequest, NotificationDispatchResponse> replyingKafkaTemplate;

    // standard constructor

    NotificationDispatchResponse dispatch(NotificationDispatchRequest notificationDispatchRequest) throws ExecutionException, InterruptedException {
        String requestTopic = synchronousKafkaProperties.requestTopic();
        ProducerRecord<String, NotificationDispatchRequest> producerRecord = new ProducerRecord<>(requestTopic, notificationDispatchRequest);

        var requestReplyFuture = replyingKafkaTemplate.sendAndReceive(producerRecord);
        return requestReplyFuture.get().value();
    }
}

Here, in our dispatch() method, we use the autowired synchronousKafkaProperties instance to extract the requestTopic configured in our application.yaml file. Then, we use it along with the notificationDispatchRequest passed in the method’s argument to create a ProducerRecord instance.

Next, we pass the created ProducerRecord instance to the sendAndReceive() method to publish the message to the request topic. The method returns a RequestReplyFuture object, on which we call get() to block until the reply arrives and then return its value. If no reply arrives within the configured timeout, the future completes exceptionally and get() throws an ExecutionException, which is why our dispatch() method declares the checked exceptions.

Under the hood, when we call the sendAndReceive() method, the ReplyingKafkaTemplate class generates a unique correlation ID, which is a random UUID, and attaches it to the outgoing message’s header. Additionally, it adds a header containing the reply topic name in which it expects the response back. Remember that we’ve already configured the reply topic in the KafkaMessageListenerContainer bean.

The ReplyingKafkaTemplate bean uses the generated correlation ID as a key to store the RequestReplyFuture object in a thread-safe ConcurrentHashMap. This allows it to work even in multi-threaded environments and support concurrent requests.
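To see how a caller might use this, here's a minimal sketch of a hypothetical REST controller, assuming spring-boot-starter-web is on the classpath; the controller class, endpoint path, and status mapping are illustrative and not part of the original setup. It invokes our dispatch() method and translates a reply timeout into an HTTP error:

@RestController
class NotificationController {

    private final NotificationDispatchService notificationDispatchService;

    // standard constructor

    @PostMapping("/api/notifications")
    ResponseEntity<NotificationDispatchResponse> dispatch(@RequestBody NotificationDispatchRequest request) throws InterruptedException {
        try {
            // blocks until the reply arrives or the configured reply timeout elapses
            return ResponseEntity.ok(notificationDispatchService.dispatch(request));
        } catch (ExecutionException exception) {
            // a KafkaReplyTimeoutException cause means no reply arrived in time
            if (exception.getCause() instanceof KafkaReplyTimeoutException) {
                return ResponseEntity.status(HttpStatus.GATEWAY_TIMEOUT).build();
            }
            throw new IllegalStateException(exception.getCause());
        }
    }
}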

3.2. Defining the Kafka Message Listener

Next, to complete our implementation, let’s create a listener component that listens to messages in the configured request topic and sends back a response to the reply topic:

@Component
class NotificationDispatchListener {

    @SendTo
    @KafkaListener(topics = "${com.baeldung.kafka.synchronous.request-topic}")
    NotificationDispatchResponse listen(NotificationDispatchRequest notificationDispatchRequest) {
        // ... processing logic
        UUID notificationId = UUID.randomUUID();
        return new NotificationDispatchResponse(notificationId);
    }
}

We use the @KafkaListener annotation to listen to the request topic configured in our application.yaml file.

Inside our listen() method, we simply return a NotificationDispatchResponse record containing a unique notificationId.

Importantly, we annotate our method with the @SendTo annotation, which instructs Spring Kafka to extract the correlation ID and reply topic name from the incoming message's headers. It then automatically sends the method's return value to the extracted reply topic, attaching the same correlation ID to the reply's headers.

This allows the ReplyingKafkaTemplate bean in our NotificationDispatchService class to fetch the correct RequestReplyFuture object using the correlation ID it originally generated.
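Purely for illustration, we could also bind these headers as method arguments in the listener, for example to log or trace them. This variation is an assumption for debugging purposes and not part of the final implementation:

@SendTo
@KafkaListener(topics = "${com.baeldung.kafka.synchronous.request-topic}")
NotificationDispatchResponse listen(
    NotificationDispatchRequest notificationDispatchRequest,
    @Header(KafkaHeaders.CORRELATION_ID) byte[] correlationId,
    @Header(KafkaHeaders.REPLY_TOPIC) byte[] replyTopic
) {
    // both headers were added by ReplyingKafkaTemplate before publishing the request
    // ... processing logic
    UUID notificationId = UUID.randomUUID();
    return new NotificationDispatchResponse(notificationId);
}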

4. Conclusion

In this article, we’ve explored using Apache Kafka to implement synchronous communication between two components in a Spring Boot application.

We walked through the necessary configurations and simulated a notification dispatch system.

By using ReplyingKafkaTemplate, we can convert the asynchronous nature of Apache Kafka into a synchronous request-reply pattern. This approach is a little unconventional, so it’s important to carefully evaluate whether it aligns with the project’s architecture before implementing it in production.

The code backing this article is available on GitHub.