1. Overview

Simply put, a CSV (Comma-Separated Values) file contains organized information separated by a comma delimiter.

In this tutorial, we’ll look into different options to read a CSV file into a Java array.

2. Sample CSV File

Let’s use a sample CSV file, book.csv:

Mary Kom,Unbreakable
Kapil Isapuari,Farishta

As a first step, let's define a comma delimiter constant that we'll use to split each line into its distinct values:

public static final String COMMA_DELIMITER = ",";

It's important to note that no single delimiter character or regular expression can parse every kind of value. This applies not only to values with embedded commas, but also to values with other embedded characters.

If we have a preexisting CSV file whose values embed the delimiter character or regular expression, and none of the methods discussed in this tutorial work for it, we should switch to a different delimiter. Likewise, when we create a new CSV file, we should choose a delimiter character or regular expression that doesn't appear in any of the values.
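
To make the problem concrete, here's a minimal sketch (using a hypothetical quoted line) of what goes wrong when the delimiter also appears inside a value:

// the comma that delimits values and the comma inside a value look the same to split()
String line = "\"Kom, Mary\",Unbreakable";
String[] values = line.split(COMMA_DELIMITER);
// values holds three tokens instead of the expected two,
// because the embedded comma also triggered a split: ["Kom], [ Mary"], [Unbreakable]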

3. BufferedReader in java.io

First, let’s read the records line by line using readLine() in BufferedReader and then split each line into tokens based on the comma delimiter:

List<List<String>> records = new ArrayList<>();
try (BufferedReader br = new BufferedReader(new FileReader("book.csv"))) {
    String line;
    while ((line = br.readLine()) != null) {
        String[] values = line.split(COMMA_DELIMITER);
        records.add(Arrays.asList(values));
    }
}

4. Scanner in java.util

Next, let's use a java.util.Scanner to run through the contents of the file and retrieve the lines one by one:

List<List<String>> records = new ArrayList<>();
try (Scanner scanner = new Scanner(new File("book.csv"))) {
    while (scanner.hasNextLine()) {
        records.add(getRecordFromLine(scanner.nextLine()));
    }
}

Next, let's parse each line and store its values in a list:

private List<String> getRecordFromLine(String line) {
    List<String> values = new ArrayList<>();
    try (Scanner rowScanner = new Scanner(line)) {
        rowScanner.useDelimiter(COMMA_DELIMITER);
        while (rowScanner.hasNext()) {
            values.add(rowScanner.next());
        }
    }
    return values;
}

5. Using the Files Utility Class

Alternatively, we can use the Files class to achieve the same objective. This utility class consists of several static methods that operate on files and directories. So, let’s see how to use it in practice.

5.1. Using Files#lines

The lines() method is one of the enhancements introduced in Java 8. It allows us to read all lines of a given file as a stream. So, let’s see it in action:

try (Stream<String> lines = Files.lines(Paths.get(CSV_FILE))) {
    List<List<String>> records = lines.map(line -> Arrays.asList(line.split(COMMA_DELIMITER)))
      .collect(Collectors.toList());
}

Here, the Paths.get(CSV_FILE) method returns a Path instance that points to the CSV file. Furthermore, we used the map() method to convert each line of the CSV file to a list of strings. Please note that we used a try-with-resources block to ensure the underlying file is closed automatically at the end.

5.2. Using Files#readAllLines

Similarly, Files offers the readAllLines() method as another way to achieve the same outcome. Like lines(), it accepts a Path parameter, but instead of a stream it directly returns a list containing every line of the specified CSV file:

List<List<String>> records = Files.readAllLines(Paths.get(CSV_FILE))
  .stream()
  .map(line -> Arrays.asList(line.split(COMMA_DELIMITER)))
  .collect(Collectors.toList());

Notably, we used the Stream API to map the lines into a List<List<String>>. An important caveat is that readAllLines() loads the entire file into memory at once, so we shouldn't use it to read large files.

5.3. Using Files#newBufferedReader

Another option would be to use the newBufferedReader() method. It returns an instance of BufferedReader, which provides a way to read the file more efficiently.

Next, let’s learn how to use this method through an example:

try (BufferedReader reader = Files.newBufferedReader(Paths.get(CSV_FILE))) {
    List<List<String>> records = reader.lines()
      .map(line -> Arrays.asList(line.split(COMMA_DELIMITER)))
      .collect(Collectors.toList());
}

As shown above, we used the same logic as before to read the CSV file. Please note that, of the approaches covered so far, newBufferedReader() is the preferred one for large files, since reader.lines() streams the content lazily instead of loading every line up front the way readAllLines() does.
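
As a quick illustration of that point, here's a minimal sketch that consumes the stream lazily, for example just counting the lines, without collecting every record into memory:

// count the lines lazily instead of materializing all the records in a list
try (BufferedReader reader = Files.newBufferedReader(Paths.get(CSV_FILE))) {
    long lineCount = reader.lines().count();
}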

6. Reading Values With Embedded Commas

With more sophisticated CSV files whose values contain commas, we can't use a plain, unmodified comma (,) as the delimiter with BufferedReader, Scanner, or the Files utility class. All of these approaches split a line on every comma they find; since they can't distinguish a comma embedded within a value from a comma that separates two values, they end up splitting the values themselves.

To parse the comma-containing values as distinct values, we have several options:

  • Pad the comma delimiter with whitespace to distinguish it from a comma embedded within a value (see the quick sketch after this list)
  • Use a custom CSV parser
  • Use an alternative delimiter

Let’s explore some of these alternatives.
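
First, here's a minimal sketch of the whitespace-padding idea, assuming a hypothetical file in which the delimiting commas are padded with spaces while the embedded commas are not:

// the delimiting comma is surrounded by spaces, the embedded one is not
String line = "\"Kom, Mary\" , Unbreakable";
String[] values = line.split(" , ");
// values -> ["Kom, Mary"] and [Unbreakable] -- the padded delimiter splits correctly

The remaining two options are covered in the subsections below.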

6.1. Using a Custom CSV Parser

One option is to write a custom CSV parser that reads the file line by line and uses a StringBuilder to build up each value. The advantage of this approach is that we can keep the comma delimiter even when commas are embedded within (quoted) values:

"Kom, Mary",Unbreakable
"Isapuari, Kapil",Farishta

Let’s learn how to use this method through an example:

List<List<String>> records = new ArrayList<>();
try (BufferedReader br = new BufferedReader(new FileReader(CSV_FILE))) {
    String line;
    while ((line = br.readLine()) != null) {
        records.add(parseLine(line));
    }
}

Next, let’s parse each line and store it in a List<String>:

private static List<String> parseLine(String line) {
    List<String> values = new ArrayList<>();
    boolean inQuotes = false;
    StringBuilder currentValue = new StringBuilder();

    for (char c : line.toCharArray()) {
        if (c == '"') {
            // toggle the quoted state; the quote itself is not part of the value
            inQuotes = !inQuotes;
        } else if (c == ',' && !inQuotes) {
            // a comma outside quotes ends the current value
            values.add(currentValue.toString());
            currentValue = new StringBuilder();
        } else {
            currentValue.append(c);
        }
    }
    values.add(currentValue.toString());
    return values;
}
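
For our sample file, calling parseLine on the line "Kom, Mary",Unbreakable returns the two values Kom, Mary and Unbreakable: the surrounding quotes are stripped and the embedded comma is preserved.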

6.2. Using an Alternative Delimiter

We can use a delimiter other than a comma, such as ‘|’, ‘/’, ‘\’, or ‘;’. Accordingly, let’s enclose values in double quotes in the CSV file, and use the pipe character “|” as the delimiter:

"Kom, Mary"|Unbreakable
"Isapuari, Kapil"|Farishta

Furthermore, to parse the file, we need to update our delimiter constant to match the one used in the CSV file:

public static final String COMMA_DELIMITER = "\\|";

Note that we need to escape special characters such as the pipe, since the delimiter is used as a Java regular expression. With this regular expression, we can use a BufferedReader, Scanner, or the Files utility class to parse the sample CSV file into lists with the elements [“Kom, Mary”, Unbreakable] and [“Isapuari, Kapil”, Farishta].
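
As a quick sanity check, here's a minimal sketch of splitting one of the pipe-delimited lines; note that a plain split keeps the surrounding double quotes, so we'd have to strip them ourselves if we don't want them:

String line = "\"Kom, Mary\"|Unbreakable";
String[] values = line.split(COMMA_DELIMITER); // COMMA_DELIMITER is now "\\|"
// values -> ["Kom, Mary"] and [Unbreakable] -- the surrounding quotes are retained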

We have the flexibility to use a different delimiter regular expression, depending on how commas and other characters are embedded within the values. Even so, this approach has its limits: it can't be used if the chosen delimiter character, such as the pipe character “|” in our example, also appears in any of the values.

7. OpenCSV

Let’s explore a more resilient approach to reading a CSV file. OpenCSV is a third-party library that provides an API to work with CSV files.

We’ll use the readNext() method in CSVReader to read the records in the file:

List<List<String>> records = new ArrayList<>();
try (CSVReader csvReader = new CSVReader(new FileReader("book.csv"))) {
    String[] values = null;
    while ((values = csvReader.readNext()) != null) {
        records.add(Arrays.asList(values));
    }
}

By default, we can use OpenCSV to read a CSV file with commas embedded within values without requiring an alternative delimiter:

"Kom, Mary",Unbreakable
"Isapuari, Kapil",Farishta

To dig deeper and learn more about OpenCSV, check out our OpenCSV tutorial.
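
Before we wrap up, it's worth noting that OpenCSV also lets us configure a separator other than a comma, should we need one. Here's a minimal sketch, assuming OpenCSV 5.x and its CSVParserBuilder/CSVReaderBuilder API:

// build a CSVReader that splits records on ';' instead of ','
CSVParser parser = new CSVParserBuilder().withSeparator(';').build();
try (CSVReader csvReader = new CSVReaderBuilder(new FileReader("book.csv"))
  .withCSVParser(parser)
  .build()) {
    String[] values;
    while ((values = csvReader.readNext()) != null) {
        // handle each record exactly as before
    }
}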

8. Conclusion

In this quick article, we explored different ways to read a CSV file into a Java array. We also looked at the options for handling values with embedded commas.

The code backing this article is available on GitHub.