1. Introduction

Apache Avro is a data serialization framework that provides rich data structures and a compact, fast, binary data format. When working with Avro in Java applications, we often need to serialize enum values. This can prove to be tricky if we don’t approach it correctly.

In this tutorial, we’ll explore how to properly serialize Java enum values using Avro. Furthermore, we’ll address common challenges we may face when working with enums in Avro.

2. Understanding Avro Enum Serialization

In Avro, enums are defined with a name and a set of symbols. When serializing Java enums, we must ensure our enum definition in the schema matches our Java enum definition. This is important because Avro validates the enum values during serialization.

Avro uses a schema-based approach, meaning the schema defines the structure of the data, including field names, types, and, in the case of enums, the permitted symbol values. As such, the schema serves as a contract between the serializer and deserializer, thus helping with data consistency.

Let’s start by adding the necessary Avro Maven dependency to our project:

<dependency>
    <groupId>org.apache.avro</groupId>
    <artifactId>avro</artifactId>
    <version>1.12.0</version>
</dependency>
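
For convenience, here are the imports the snippets in this tutorial rely on. This is a reference sketch; we're assuming JUnit 5 for the tests, and the tempDir used in the tests is a JUnit-managed temporary directory:

import java.io.File;
import java.io.IOException;
import java.nio.file.Path;
import java.util.Arrays;

import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;
import org.apache.avro.file.DataFileReader;
import org.apache.avro.file.DataFileWriter;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.DatumReader;
import org.apache.avro.io.DatumWriter;

import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.io.TempDir;

import static org.junit.jupiter.api.Assertions.assertDoesNotThrow;
import static org.junit.jupiter.api.Assertions.assertEquals;

// assumption: the tests use a JUnit 5 temporary directory
@TempDir
Path tempDir;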

3. Defining Enums in Avro Schema

First, let’s look at how to correctly define an enum when creating an Avro schema:

Schema colorEnum = SchemaBuilder.enumeration("Color")
  .namespace("com.baeldung.apache.avro")
  .symbols("UNKNOWN", "GREEN", "RED", "BLUE");

This creates an enum schema with four available values. The namespace helps prevent naming conflicts. In addition, the symbols define the valid enum values.
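
Because Avro validates symbols during serialization, we can also check a candidate value up front. Here's a minimal sketch using GenericData.validate(), which returns a boolean rather than throwing:

GenericData.EnumSymbol valid = new GenericData.EnumSymbol(colorEnum, "RED");
GenericData.EnumSymbol invalid = new GenericData.EnumSymbol(colorEnum, "PURPLE");

// validate() checks a datum against a schema without serializing it
System.out.println(GenericData.get().validate(colorEnum, valid));   // true
System.out.println(GenericData.get().validate(colorEnum, invalid)); // false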

Now, let’s use this enum in a record schema:

Schema recordSchema = SchemaBuilder.record("ColorRecord")
  .namespace("com.baeldung.apache.avro")
  .fields()
  .name("color")
  .type(colorEnum)
  .noDefault()
  .endRecord();

This creates a record schema, ColorRecord, with a field named color of the enum type we defined earlier.
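
As a quick sanity check (purely illustrative), we can inspect the field's schema to confirm it's our enum:

Schema colorFieldSchema = recordSchema.getField("color").schema();

System.out.println(colorFieldSchema.getType());        // ENUM
System.out.println(colorFieldSchema.getEnumSymbols()); // [UNKNOWN, GREEN, RED, BLUE]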

4. Serializing Enum Values

Now that we’ve defined our enum schema, let’s explore how we can serialize the enum values.

In this section, we'll discuss the standard approach for basic enum serialization. In addition, we'll address the common challenge of handling enums within union types, which is often a source of confusion.

4.1. Correct Approach for Basic Enum Serialization

To serialize an enum value correctly, we need to create an EnumSymbol object using the appropriate enum schema (colorEnum):

public void serializeEnumValue(File file) throws IOException {
    GenericRecord record = new GenericData.Record(recordSchema);
    GenericData.EnumSymbol colorSymbol = new GenericData.EnumSymbol(colorEnum, "RED");
    record.put("color", colorSymbol);

    DatumWriter<GenericRecord> datumWriter = new GenericDatumWriter<>(recordSchema);
    try (DataFileWriter<GenericRecord> dataFileWriter = new DataFileWriter<>(datumWriter)) {
        dataFileWriter.create(recordSchema, file);
        dataFileWriter.append(record);
    }
}

First, we create a GenericRecord based on our recordSchema. Next, we create an EnumSymbol with our enum schema (colorEnum) and the value “RED”. Finally, we add this to our record and serialize it to the file we pass in, using DatumWriter and DataFileWriter.

Now, let’s test our implementation:

@Test
void whenSerializingEnum_thenSuccess() throws IOException {
    File file = tempDir.resolve("color.avro").toFile();

    serializeEnumValue(file);

    DatumReader<GenericRecord> datumReader = new GenericDatumReader<>(recordSchema);
    try (DataFileReader<GenericRecord> dataFileReader = new DataFileReader<>(file, datumReader)) {
        GenericRecord result = dataFileReader.next();
        assertEquals("RED", result.get("color").toString());
    }
}

This test confirms that we can successfully serialize and deserialize an enum value.

4.2. Handling Union Types With Enums

Now, let’s see how we can handle a common issue we may face, serializing enums within union types:

Schema colorEnum = SchemaBuilder.enumeration("Color")
  .namespace("com.baeldung.apache.avro")
  .symbols("UNKNOWN", "GREEN", "RED", "BLUE");
    
Schema unionSchema = SchemaBuilder.unionOf()
  .type(colorEnum)
  .and()
  .nullType()
  .endUnion();
    
Schema recordWithUnionSchema = SchemaBuilder.record("ColorRecordWithUnion")
  .namespace("com.baeldung.apache.avro")
  .fields()
  .name("color")
  .type(unionSchema)
  .noDefault()
  .endRecord();

Let’s analyze the defined schemas. We’ve defined a union schema that can be either our enum type or null. This pattern is common when a field is optional. Next, we’ve created a record schema with a field using this union type.
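
As a side note, we could build the same union without SchemaBuilder; here's an equivalent sketch using Schema.createUnion (this relies on java.util.Arrays):

Schema sameUnion = Schema.createUnion(
    Arrays.asList(colorEnum, Schema.create(Schema.Type.NULL)));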

As such, when we serialize an enum within a union, we’ll still use the EnumSymbol but with the correct schema reference:

GenericRecord record = new GenericData.Record(recordWithUnionSchema);
GenericData.EnumSymbol colorSymbol = new GenericData.EnumSymbol(colorEnum, "RED");
record.put("color", colorSymbol);

Importantly, we’ve created the EnumSymbol with the enum schema, not the union schema. Using the union schema here is a common mistake that leads to serialization errors.
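
To make the pitfall concrete, here's what the incorrect variant looks like; the exact exception can vary by Avro version, so treat the comment as indicative:

// WRONG: the symbol carries the union schema instead of the enum schema
GenericRecord badRecord = new GenericData.Record(recordWithUnionSchema);
GenericData.EnumSymbol badSymbol = new GenericData.EnumSymbol(unionSchema, "RED");
badRecord.put("color", badSymbol);
// appending badRecord fails because Avro can't match the symbol to a union branch
// (typically surfacing as an UnresolvedUnionException or similar)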

Now, let’s test our implementation for the union handling:

@Test
void whenSerializingEnumInUnion_thenSuccess() throws IOException {
    File file = tempDir.resolve("colorUnion.avro").toFile();

    GenericRecord record = new GenericData.Record(recordWithUnionSchema);
    GenericData.EnumSymbol colorSymbol = new GenericData.EnumSymbol(colorEnum, "GREEN");
    record.put("color", colorSymbol);

    DatumWriter<GenericRecord> datumWriter = new GenericDatumWriter<>(recordWithUnionSchema);
    try (DataFileWriter<GenericRecord> dataFileWriter = new DataFileWriter<>(datumWriter)) {
        dataFileWriter.create(recordWithUnionSchema, file);
        dataFileWriter.append(record);
    }

    DatumReader<GenericRecord> datumReader = new GenericDatumReader<>(recordWithUnionSchema);
    try (DataFileReader<GenericRecord> dataFileReader = new DataFileReader<>(file, datumReader)) {
        GenericRecord result = dataFileReader.next();
        assertEquals("GREEN", result.get("color").toString());
    }
}

We can also test handling null values in the union:

@Test
void whenSerializingNullInUnion_thenSuccess() throws IOException {
    File file = tempDir.resolve("colorNull.avro").toFile();

    GenericRecord record = new GenericData.Record(recordWithUnionSchema);
    record.put("color", null);

    DatumWriter<GenericRecord> datumWriter = new GenericDatumWriter<>(recordWithUnionSchema);
    assertDoesNotThrow(() -> {
        try (DataFileWriter<GenericRecord> dataFileWriter = new DataFileWriter<>(datumWriter)) {
            dataFileWriter.create(recordWithUnionSchema, file);
            dataFileWriter.append(record);
        }
    });
}

5. Schema Evolution With Enums

Schema evolution is a particularly sensitive area when dealing with enums, as adding or removing enum values can lead to compatibility issues. In this section, we’ll explore how to update our data structures as requirements change. We’ll focus on working with enum types and maintaining backward compatibility through proper default value configuration.

5.1. Adding New Enum Values

When we have to expand our schema, adding new enum values requires careful consideration, since older readers won’t recognize the new symbols. For backward compatibility, adding a default value is crucial:

@Test
void whenSchemaEvolution_thenDefaultValueUsed() throws IOException {
    String evolvedSchemaJson = "{\"type\":\"record\","
      + "\"name\":\"ColorRecord\","
      + "\"namespace\":\"com.baeldung.apache.avro\","
      + "\"fields\":[{\"name\":\"color\",\"type\":"
      + "{\"type\":\"enum\",\"name\":\"Color\","
      + "\"symbols\":[\"UNKNOWN\",\"GREEN\",\"RED\",\"BLUE\",\"YELLOW\"],"
      + "\"default\":\"UNKNOWN\"}}]}";

    Schema evolvedRecordSchema = new Schema.Parser().parse(evolvedSchemaJson);
    Schema evolvedEnum = evolvedRecordSchema.getField("color").schema();

    File file = tempDir.resolve("colorEvolved.avro").toFile();

    GenericRecord record = new GenericData.Record(evolvedRecordSchema);
    GenericData.EnumSymbol colorSymbol = new GenericData.EnumSymbol(evolvedEnum, "YELLOW");
    record.put("color", colorSymbol);

    DatumWriter<GenericRecord> datumWriter = new GenericDatumWriter<>(evolvedRecordSchema);
    try (DataFileWriter<GenericRecord> dataFileWriter = new DataFileWriter<>(datumWriter)) {
        dataFileWriter.create(evolvedRecordSchema, file);
        dataFileWriter.append(record);
    }

    String originalSchemaJson = "{\"type\":\"record\","
      + "\"name\":\"ColorRecord\","
      + "\"namespace\":\"com.baeldung.apache.avro\","
      + "\"fields\":[{\"name\":\"color\",\"type\":"
      + "{\"type\":\"enum\",\"name\":\"Color\","
      + "\"symbols\":[\"UNKNOWN\",\"GREEN\",\"RED\",\"BLUE\"],"
      + "\"default\":\"UNKNOWN\"}}]}";

    Schema originalRecordSchema = new Schema.Parser().parse(originalSchemaJson);

    DatumReader<GenericRecord> datumReader =
      new GenericDatumReader<>(evolvedRecordSchema, originalRecordSchema);
    try (DataFileReader<GenericRecord> dataFileReader = new DataFileReader<>(file, datumReader)) {
        GenericRecord result = dataFileReader.next();
        assertEquals("UNKNOWN", result.get("color").toString());
    }
}

Now, let’s analyze the code above. We’ve evolved our schema (evolvedSchemaJson) by adding a new symbol, “YELLOW”. Next, we’ve created a record with the “YELLOW” enum value and written it to a file.

Then, we’ve defined the “original schema” (originalSchemaJson), which lacks “YELLOW” but declares the same default value. As we noted earlier, this default is what keeps the change backward compatible.

Finally, when we read the data with the original schema, we verify that the default value “UNKNOWN” is used instead of “YELLOW”.

For proper schema evolution with enums, we need to specify the default value at the enum type level, rather than at the field level. This is why we’re using JSON strings to define our schemas in this example: they give us direct control over the structure.
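
To make the distinction concrete, here's a minimal sketch that parses just the enum with a type-level default; getEnumDefault() (available in recent Avro versions) exposes it:

Schema enumWithDefault = new Schema.Parser().parse(
    "{\"type\":\"enum\",\"name\":\"Color\","
    + "\"namespace\":\"com.baeldung.apache.avro\","
    + "\"symbols\":[\"UNKNOWN\",\"GREEN\",\"RED\",\"BLUE\"],"
    + "\"default\":\"UNKNOWN\"}");

// the default lives on the enum itself, not on a record field
System.out.println(enumWithDefault.getEnumDefault()); // UNKNOWN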

6. Conclusion

In this article, we’ve explored how to properly serialize enum values using Apache Avro. We’ve looked at basic enum serialization, handling unions with enums, and addressing schema evolution challenges.

When working with enums in Avro, we should remember some key points. First, we’ll need to define our enum schema with the correct namespace and symbols. Using GenericData.EnumSymbol with the appropriate enum schema reference is important.

Furthermore, for union types, we create the enum symbol with the enum schema, not the union schema.

Lastly, regarding schema evolution, we need to place the default value at the enum type level to maintain compatibility.

As always, the code backing this article is available over on GitHub.