
1. Introduction

In Spring Batch, the CompositeItemReader is a tool for combining multiple ItemReader instances into a single reader. This is particularly useful when we need to read data from multiple sources or in a specific sequence. For example, we might want to read records from a database and a file simultaneously or process data from two different tables in a specific order.

The CompositeItemReader simplifies handling multiple readers in a batch job, ensuring efficient and flexible data processing. In this tutorial, we’ll go through the implementation of a CompositeItemReader in Spring Batch and look at examples and test cases to validate its behavior.

2. Understanding the CompositeItemReader

The CompositeItemReader works by delegating the reading process to a list of ItemReader instances. It reads items from each reader in the order they’re defined, ensuring that data is processed sequentially.

This is especially useful in scenarios like:

  • Reading from multiple databases or tables
  • Combining data from files and databases
  • Processing data from different sources in a specific sequence
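Conceptually, the sequential delegation described above can be modeled in a few lines of plain Java. This is a simplified sketch, not Spring Batch's actual implementation (the real class also manages each delegate's ItemStream lifecycle); each Supplier stands in for an ItemReader whose read() returns null once it's exhausted:

```java
import java.util.Iterator;
import java.util.List;
import java.util.function.Supplier;

// A minimal model of sequential delegation across readers
class SequentialReader<T> {
    private final Iterator<Supplier<T>> delegates;
    private Supplier<T> current;

    SequentialReader(List<Supplier<T>> readers) {
        this.delegates = readers.iterator();
        this.current = delegates.hasNext() ? delegates.next() : null;
    }

    // Read from the current delegate; when it is exhausted (returns null),
    // move on to the next delegate until all of them are drained
    T read() {
        while (current != null) {
            T item = current.get();
            if (item != null) {
                return item;
            }
            current = delegates.hasNext() ? delegates.next() : null;
        }
        return null;
    }
}
```

This mirrors the contract we rely on throughout the rest of the tutorial: items from the first reader come out first, and the composite only signals the end of data once every delegate is exhausted.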

Additionally, the CompositeItemReader is part of the org.springframework.batch.item.support package, and it was introduced in Spring Batch 5.2.0.
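Because the class only ships with Spring Batch 5.2.0 and later, our build needs a recent enough version. As a sketch, assuming a plain Maven setup (in a Spring Boot project, the managed spring-boot-starter-batch version must resolve to Batch 5.2+ instead):

```xml
<dependency>
    <groupId>org.springframework.batch</groupId>
    <artifactId>spring-batch-core</artifactId>
    <version>5.2.0</version>
</dependency>
```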

3. Implementing a CompositeItemReader

Let’s walk through an example where we read data from two different sources: a flat file and a database. The goal is to combine product data from both sources into a single stream for batch processing. Some products live in the flat file while others are stored in the database, so reading from both ensures all available records are processed together.

3.1. Create Product Class

Before we set up the readers, we need a Product class that represents the structure of the data being processed. This class encapsulates details about a product, such as its ID, name, stock availability, and price. We’ll use this model while reading from both the CSV file and the database, ensuring consistency in data handling.

The Product class serves as a data transfer object (DTO) between our readers and the batch job:

public class Product {
    private Long productId;
    private String productName;
    private Integer stock;
    private BigDecimal price;

    public Product(Long productId, String productName, Integer stock, BigDecimal price) {
        this.productId = productId;
        this.productName = productName;
        this.stock = stock;
        this.price = price;
    }

    // Getters and Setters
}

The Product class represents each record that will be processed by our batch job. Now that our data model is ready, we’ll create individual ItemReader components for the CSV file and the database.

3.2. Flat File Reader for Product Data

The first reader fetches data from a CSV file using FlatFileItemReader. We configure it to read a delimited file (products.csv) and map its fields to the Product class:

@Bean
public FlatFileItemReader<Product> fileReader() {
  return new FlatFileItemReaderBuilder<Product>()
    .name("fileReader")
    .resource(new ClassPathResource("products.csv"))
    .delimited()
    .names("productId", "productName", "stock", "price")
    .linesToSkip(1)
    .targetType(Product.class)
    .build();
}

Here, the delimited() method ensures the data fields are separated using a delimiter (by default, a comma). The names() method defines the column names matching the attributes of the Product class, while the targetType(Product.class) method maps the fields to the class attributes.

3.3. Database Reader for Product Data

Next, we define a JdbcCursorItemReader to retrieve product data from a database table named products. This reader executes an SQL query to fetch product details and maps them to our Product class.

Below is the implementation of the database reader:

@Bean
public JdbcCursorItemReader<Product> dbReader(DataSource dataSource) {
  return new JdbcCursorItemReaderBuilder<Product>()
    .name("dbReader")
    .dataSource(dataSource)
    .sql("SELECT productid, productname, stock, price FROM products")
    .rowMapper((rs, rowNum) -> new Product(
      rs.getLong("productid"),
      rs.getString("productname"),
      rs.getInt("stock"),
      rs.getBigDecimal("price")))
    .build();
}

The JdbcCursorItemReader reads product records from the database one row at a time using a cursor, making it efficient for batch processing. The rowMapper() function maps each column from the result set to the corresponding field in the Product class.
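The article doesn’t show the products table itself; a minimal schema sketch consistent with the query above (column types are assumptions) would be:

```sql
CREATE TABLE products (
    productid   BIGINT PRIMARY KEY,
    productname VARCHAR(100),
    stock       INT,
    price       DECIMAL(10, 2)
);
```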

4. Combining Readers Using CompositeItemReader

Now that both our CSV and database readers are configured to read product data, we can integrate them using CompositeItemReader:

@Bean
public CompositeItemReader<Product> compositeReader(FlatFileItemReader<Product> fileReader,
  JdbcCursorItemReader<Product> dbReader) {
    return new CompositeItemReader<>(Arrays.asList(fileReader, dbReader));
}

By configuring our CompositeItemReader, we can sequentially process data from multiple sources.

Initially, the FlatFileItemReader reads product records from the CSV file, parsing each row into a Product object. Once all rows from the file have been processed, the JdbcCursorItemReader takes over and starts fetching product data directly from the database.

5. Configuring the Batch Job

Once we’ve defined our readers for both the CSV file and the database, the next step is to configure the batch job itself. In Spring Batch, a job consists of multiple steps, where each step handles a specific part of the processing pipeline:

@Bean
public Job productJob(JobRepository jobRepository, Step step) {
  return new JobBuilder("productJob", jobRepository)
    .start(step)
    .build();
}

@Bean
public Step step(JobRepository jobRepository, PlatformTransactionManager transactionManager,
  ItemReader<Product> compositeReader, ItemWriter<Product> productWriter) {
  return new StepBuilder("productStep", jobRepository)
    .<Product, Product>chunk(10, transactionManager)
    .reader(compositeReader)
    .writer(productWriter)
    .build();
}

In this case, our job contains a single step that reads product data, processes it in chunks of 10, and writes it to the desired output.

The productJob bean defines the batch job itself and starts execution with productStep. With this setup, the step first reads product data from both sources through the CompositeItemReader and then writes each chunk using productWriter, giving us a smooth and efficient batch processing pipeline.
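The productWriter bean itself isn’t shown in this article; as a hypothetical stand-in, a minimal writer that simply prints each product (relying on Spring Batch 5’s Chunk-based ItemWriter contract, where Chunk is Iterable) could look like:

```java
@Bean
public ItemWriter<Product> productWriter() {
    // Chunk implements Iterable, so we can print each product directly;
    // a real job would write to a file, a table, or a message queue instead
    return chunk -> chunk.forEach(System.out::println);
}
```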

6. Running the Batch Job

Now that we’ve configured the readers and the job, the next step is to run the batch job and observe the behavior of CompositeItemReader. We’ll run the job within a Spring Boot application to see how it processes data from both the CSV file and the database.

In order to trigger the batch job programmatically, we’ll need to use JobLauncher. This allows us to launch the job and monitor its progress:

@Bean
public CommandLineRunner runJob(JobLauncher jobLauncher, Job productJob) {
    return args -> {
        try {
            jobLauncher.run(productJob, new JobParametersBuilder()
              .addLong("time", System.currentTimeMillis())
              .toJobParameters());
        } catch (Exception e) {
            // handle exception
        }
    };
}

In this example, we create a CommandLineRunner bean that runs the job when the application starts by invoking productJob through JobLauncher. We also add a timestamp as a JobParameter so that each launch is treated as a new job instance.

7. Testing the Composite Item Reader

To verify that the CompositeItemReader works as expected, we’ll write tests that confirm it reads products correctly from both the CSV and database sources.

7.1. Preparing Test Data

We’ll first prepare a CSV file containing product data, which serves as the input for CompositeItemReader:

productId,productName,stock,price
101,Apple,50,1.99

Then, we also insert a record into the products table:

@BeforeEach
public void setUp() {
    jdbcTemplate.update("INSERT INTO products (productid, productname, stock, price) VALUES (?, ?, ?, ?)",
      102, "Banana", 30, 1.49);
}

7.2. Test the Sequential Reading Order

Now, we’ll test CompositeItemReader to verify that it processes products in the correct order, reading from both the CSV and the database sources. In this test, we read a product from the CSV file followed by the database and assert that the values match our expectations:

@Test
public void givenTwoReaders_whenRead_thenProcessProductsInOrder() throws Exception {
    StepExecution stepExecution = new StepExecution(
      "testStep",
      new JobExecution(1L, new JobParameters()),
      1L);
    ExecutionContext executionContext = stepExecution.getExecutionContext();
    compositeReader.open(executionContext);

    Product product1 = compositeReader.read();
    assertNotNull(product1);
    assertEquals(101, product1.getProductId());
    assertEquals("Apple", product1.getProductName());

    Product product2 = compositeReader.read();
    assertNotNull(product2);
    assertEquals(102, product2.getProductId());
    assertEquals("Banana", product2.getProductName());
}

7.3. Test With One Reader Returning Null Results

In this section, we test the behavior of CompositeItemReader when multiple readers are used and one of the readers returns null. This is important to ensure that CompositeItemReader skips over any readers that return no data and continues to the next reader until it finds valid data:

@Test
public void givenMultipleReader_whenOneReaderReturnNull_thenProcessDataFromNextReader() throws Exception {
    ItemStreamReader<Product> emptyReader = mock(ItemStreamReader.class);
    when(emptyReader.read()).thenReturn(null);

    ItemStreamReader<Product> validReader = mock(ItemStreamReader.class);
    when(validReader.read()).thenReturn(new Product(103L, "Cherry", 20, BigDecimal.valueOf(2.99)), null);

    CompositeItemReader<Product> compositeReader = new CompositeItemReader<>(
      Arrays.asList(emptyReader, validReader));

    Product product = compositeReader.read();
    assertNotNull(product);
    assertEquals(103, product.getProductId());
    assertEquals("Cherry", product.getProductName());
}

8. Conclusion

In this article, we learned how to implement and test a CompositeItemReader, which lets us read from multiple sources in a specific sequence. By chaining readers together, we can combine data from files, databases, or other sources into a single processing stream.

The code backing this article is available on GitHub.