1. Introduction

In text encoding, the Byte Order Mark (BOM) is a special marker at the beginning of a file that indicates its encoding scheme and, for multi-byte encodings, its byte order. UTF-8 has no byte-order variants, so its BOM acts purely as a signature: a sequence of three bytes, 0xEF, 0xBB, and 0xBF, that signals to software that the file is encoded in UTF-8.

In this tutorial, we’ll explore different methods to add a UTF-8 BOM to a file in Java, examining both byte-level and text-level approaches while keeping the way we handle the BOM consistent across them.

2. Understanding the UTF-8 BOM

The UTF-8 BOM indicates that a file is encoded in UTF-8 through a special sequence of bytes. Although it isn’t mandatory, including the BOM can be crucial in certain situations, especially when working with older software or specific platforms that rely on it to detect the encoding format.

As we mentioned above, the UTF-8 BOM consists of three bytes in hexadecimal: 0xEF, 0xBB, and 0xBF.

These three bytes are simply the UTF-8 encoding of the Unicode character \uFEFF, known as the Zero-Width No-Break Space (ZWNBSP). Writing this character to a UTF-8 encoded stream therefore produces exactly the same byte sequence and serves the same function.

To ensure consistency in our code, we’ll define both the byte sequence and the Unicode representation as constants throughout this tutorial:

private static final byte[] UTF8_BOM = {(byte) 0xEF, (byte) 0xBB, (byte) 0xBF};
private static final String UTF8_BOM_UNICODE = "\uFEFF";

Throughout this tutorial, we’ll add the BOM either as raw bytes or as the Zero-Width No-Break Space character, depending on whether we’re working with bytes or strings.
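
As a quick sanity check (a minimal sketch, not part of the original examples, assuming JUnit’s assertArrayEquals is available alongside the assertTrue used later), we can confirm that encoding the character in UTF-8 produces exactly the three BOM bytes:

@Test
public void givenBomCharacter_whenEncodedAsUtf8_thenMatchesBomBytes() {
    // encoding U+FEFF in UTF-8 yields 0xEF, 0xBB, 0xBF
    byte[] encoded = UTF8_BOM_UNICODE.getBytes(StandardCharsets.UTF_8);
    assertArrayEquals(UTF8_BOM, encoded);
}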

3. Using FileOutputStream and the write Method

One of the simplest methods to add a BOM to a file is to use Java’s FileOutputStream, which allows us to write raw bytes directly. This approach provides control over the exact byte sequence in the file, making it suitable for low-level, byte-oriented file operations.

First, let’s manually write the BOM bytes at the beginning of the file as 0xEF, 0xBB, and 0xBF, followed by the UTF-8 encoded content:

private static final String FILE_PATH_OUTPUT_STREAM = "output_with_bom.txt";
private static final String TEST_CONTENT = "This is the content of the file";

@Test
public void givenText_whenAddingBomWithFileOutputStream_thenBOMAdded() throws IOException {
    try (FileOutputStream fos = new FileOutputStream(FILE_PATH_OUTPUT_STREAM)) {
        fos.write(UTF8_BOM);
        fos.write(TEST_CONTENT.getBytes(StandardCharsets.UTF_8));
    }

    String result = Files.readString(Path.of(FILE_PATH_OUTPUT_STREAM), StandardCharsets.UTF_8);
    assertTrue(result.startsWith(UTF8_BOM_UNICODE));
    assertTrue(result.contains(TEST_CONTENT));
}

We first define the file path, content, and the byte array representing the UTF-8 BOM. Then, we open the file using FileOutputStream inside a try-with-resources block, ensuring the stream automatically closes after we finish with it.

Next, we write() the BOM bytes to the file, followed by the UTF-8 encoded content.

Finally, we read the file back using Files.readString() and ensure that the file starts with the BOM and contains the expected file content.

Note that this approach operates at the byte level; when we read the content back as a string, the BOM bytes are decoded into their Unicode equivalent, \uFEFF.
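
If we want to verify the file at the byte level as well, a few extra lines (a sketch that could be appended to the test above; Arrays comes from java.util) compare the first three raw bytes against the BOM sequence:

byte[] raw = Files.readAllBytes(Path.of(FILE_PATH_OUTPUT_STREAM));

// the first three bytes on disk are the BOM itself, before any decoding happens
assertArrayEquals(UTF8_BOM, Arrays.copyOfRange(raw, 0, 3));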

4. Writing UTF-8 with BOM Using Java Writers

When writing UTF-8 files with a BOM in Java, we can leverage writers that wrap output streams. BufferedWriter and PrintWriter allow us to add the BOM as we write the file content. These approaches handle encoding and provide higher-level abstractions for easier file output.

4.1. Using BufferedWriter and OutputStreamWriter

Using BufferedWriter with OutputStreamWriter offers a high-level approach for managing the BOM in UTF-8 files:

private static final String FILE_PATH_BUFFERED_WRITER = "output_with_bom_buffered.txt";

@Test
public void givenText_whenAddingBomWithBufferedWriter_thenBOMAdded() throws IOException {
    try (OutputStreamWriter osw = new OutputStreamWriter(
            new FileOutputStream(FILE_PATH_BUFFERED_WRITER), StandardCharsets.UTF_8);
         BufferedWriter writer = new BufferedWriter(osw)) {

        writer.write(UTF8_BOM_UNICODE);
        writer.write(TEST_CONTENT);
    }

    String result = Files.readString(Path.of(FILE_PATH_BUFFERED_WRITER), StandardCharsets.UTF_8);
    assertTrue(result.startsWith(UTF8_BOM_UNICODE));
    assertTrue(result.contains(TEST_CONTENT));
}

In this method, we open the file with a FileOutputStream and wrap it in an OutputStreamWriter, which lets us specify the UTF-8 encoding.

Then, we wrap that writer in a BufferedWriter, which buffers the output for more efficient writing. We write the BOM at the start of the file using the UTF8_BOM_UNICODE constant, followed by the actual content.

Finally, we read the content back to verify that the BOM is at the start of the file followed by our contents.

This method is preferable when we’re working with text files and want higher-level encoding management.
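
For comparison, the same pattern can be written more compactly with java.nio.file.Files.newBufferedWriter(), which returns a BufferedWriter for the given charset directly. This isn’t one of the approaches covered above, just a sketch of an equivalent alternative (the file name here is hypothetical):

// hypothetical output file, used only for this sketch
Path path = Path.of("output_with_bom_files_writer.txt");

try (BufferedWriter writer = Files.newBufferedWriter(path, StandardCharsets.UTF_8)) {
    writer.write(UTF8_BOM_UNICODE); // BOM character first
    writer.write(TEST_CONTENT);     // then the content
}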

4.2. Using PrintWriter with OutputStreamWriter

Another option involves using PrintWriter with OutputStreamWriter. This approach offers a convenient text output format, especially for structured text:

private static final String FILE_PATH_PRINT_WRITER = "output_with_bom_print_writer.txt";

@Test
public void givenText_whenUsingPrintWriter_thenBOMAdded() throws IOException {
    try (PrintWriter writer = new PrintWriter(
            new OutputStreamWriter(
              new FileOutputStream(FILE_PATH_PRINT_WRITER), StandardCharsets.UTF_8))) {
        writer.write(UTF8_BOM_UNICODE);
        writer.println(TEST_CONTENT);
    }

    String result = Files.readString(Path.of(FILE_PATH_PRINT_WRITER), StandardCharsets.UTF_8);
    assertTrue(result.startsWith(UTF8_BOM_UNICODE));
    assertTrue(result.contains(TEST_CONTENT));
}

Here, the OutputStreamWriter again specifies the UTF-8 encoding, while the PrintWriter provides convenient methods for writing formatted text. We use write() to add the BOM manually via UTF8_BOM_UNICODE, followed by println() for the content.

5. Using Apache Commons IO

Apache Commons IO simplifies file handling. While we still need to prepend the BOM ourselves, the library’s utility methods reduce writing and reading the file to a single call each:

private static final String FILE_PATH_COMMONS_IO = "output_with_bom_commons_io.txt";

@Test
public void givenText_whenUsingCommonsIO_thenBOMAdded() throws IOException {
    byte[] bomAndContent = ArrayUtils.addAll(
      UTF8_BOM,
      TEST_CONTENT.getBytes(StandardCharsets.UTF_8)
    );
    FileUtils.writeByteArrayToFile(new File(FILE_PATH_COMMONS_IO), bomAndContent);

    String result = FileUtils.readFileToString(
      new File(FILE_PATH_COMMONS_IO), StandardCharsets.UTF_8
    );
    assertTrue(result.startsWith(UTF8_BOM_UNICODE));
    assertTrue(result.contains(TEST_CONTENT));
}

We combine the BOM bytes with the content bytes in an array using ArrayUtils.addAll() from Apache Commons Lang. Then, we use FileUtils.writeByteArrayToFile() from Apache Commons IO to write the BOM and content in one step.

FileUtils.readFileToString() reads the entire file into a string, letting us verify the BOM and content. Note that we add the BOM as raw bytes, but it’s interpreted as the Unicode character when read back.
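
If we also want to detect the BOM at read time with the same library, Commons IO provides BOMInputStream (org.apache.commons.io.input.BOMInputStream), which recognizes the UTF-8 BOM by default. The following is only a sketch under that assumption:

try (BOMInputStream bomIn = new BOMInputStream(new FileInputStream(FILE_PATH_COMMONS_IO))) {
    // BOMInputStream detects (and skips) the UTF-8 BOM by default
    assertTrue(bomIn.hasBOM());
}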

This approach is particularly effective for scenarios where Apache Commons libraries are already in use, as they provide efficient methods for file I/O while simplifying BOM management.

6. Conclusion

In this article, we’ve explored various methods for adding a UTF-8 Byte Order Mark (BOM) to a file in Java.

We started with the basic approach, using FileOutputStream to write the BOM bytes. Then we combined OutputStreamWriter with BufferedWriter or PrintWriter to manage the BOM.

Finally, we used third-party libraries like Apache Commons IO for simplified file handling.

The code backing this article is available on GitHub.