1. Introduction

In this tutorial, we'll explore different approaches to serializing and deserializing Date objects in Java using Apache Avro. Avro is a data serialization system that provides a compact, fast binary data format along with schema-based data definitions.

When working with dates in Avro, we face challenges because Avro doesn’t natively support the Java Date class in its type structure. Now, let’s look at the challenge with Date serialization in more detail.

2. The Challenge With Date Serialization

Before we get started, let’s add the Avro dependency to our Maven project:

<dependency>
    <groupId>org.apache.avro</groupId>
    <artifactId>avro</artifactId>
    <version>1.12.0</version>
</dependency>

Avro’s type system consists of primitive types: null, boolean, int, long, float, double, bytes, and string. In addition, the supported complex types are: record, enum, array, map, union, and fixed.
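Notably, there's no date type in either list. To illustrate the type system itself, here's a small record schema combining primitive and complex types (the record and field names are our own, chosen for illustration):

Schema userSchema = new Schema.Parser().parse(
    "{"
    + "\"type\": \"record\","
    + "\"name\": \"User\","
    + "\"fields\": ["
    + "  {\"name\": \"name\", \"type\": \"string\"},"
    + "  {\"name\": \"email\", \"type\": [\"null\", \"string\"], \"default\": null},"
    + "  {\"name\": \"scores\", \"type\": {\"type\": \"array\", \"items\": \"int\"}}"
    + "]"
    + "}");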

Now, let’s look at an example of why date serialization can be problematic in Avro:

public class DateContainer {
    private Date date;
    
    // Constructors, getters, and setters
}

When we try to serialize this class directly using Avro's reflection-based serialization, the default behavior internally converts the Date object to a long value (milliseconds since the epoch).

Unfortunately, this process can lead to precision issues. For example, the deserialized value could be off by a few milliseconds from the original.

3. Implementing Date Serialization

Next, we'll implement Date serialization and deserialization in two ways: with logical types and GenericRecord, and with Avro's Conversion API.

3.1. Using Logical Types With GenericRecord

Since Avro 1.8, the framework has provided logical types. A logical type annotates an underlying primitive type with additional semantic meaning.

For dates and timestamps, we have three logical types (illustrated with concrete values right after this list):

  1. date: represents a date without a time component, stored as an int (days since the epoch)
  2. timestamp-millis: represents a timestamp with millisecond precision, stored as a long (milliseconds since the epoch)
  3. timestamp-micros: represents a timestamp with microsecond precision, stored as a long (microseconds since the epoch)
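Concretely, here's what the stored values look like for a couple of sample inputs (the dates are our own examples):

int epochDays = (int) LocalDate.of(2024, 1, 1).toEpochDay();              // 19723
long epochMillis = Instant.parse("2024-01-01T00:00:00Z").toEpochMilli();  // 1704067200000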

Now, let’s see how to use these logical types in an Avro schema:

public static Schema createDateSchema() {
    String schemaJson = 
        "{"
        + "\"type\": \"record\","
        + "\"name\": \"DateRecord\","
        + "\"fields\": ["
        + "  {\"name\": \"date\", \"type\": {\"type\": \"int\", \"logicalType\": \"date\"}},"
        + "  {\"name\": \"timestamp\", \"type\": {\"type\": \"long\", \"logicalType\": \"timestamp-millis\"}}"
        + "]"
        + "}";
    return new Schema.Parser().parse(schemaJson);
}

Notably, we’ve applied the logical type to the underlying primitive type, not directly to the field.
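To make that distinction concrete, this is the nested form, which Avro recognizes:

{"name": "date", "type": {"type": "int", "logicalType": "date"}}

By contrast, in the field-level form below, the attribute is, as far as we can tell, treated as an ordinary field property and ignored when resolving logical types:

{"name": "date", "type": "int", "logicalType": "date"}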

Now, let’s look at how we can implement Date serialization using logical types:

public static byte[] serializeDateWithLogicalType(LocalDate date, Instant timestamp) throws IOException {
    Schema schema = createDateSchema();
    GenericRecord record = new GenericData.Record(schema);
    
    // date logical type: days since the epoch, as an int
    record.put("date", (int) date.toEpochDay());
    
    // timestamp-millis logical type: milliseconds since the epoch, as a long
    record.put("timestamp", timestamp.toEpochMilli());
    
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    DatumWriter<GenericRecord> datumWriter = new GenericDatumWriter<>(schema);
    Encoder encoder = EncoderFactory.get().binaryEncoder(baos, null);
    
    datumWriter.write(record, encoder);
    encoder.flush();
    
    return baos.toByteArray();
}

Let's go over the above logic. We convert the LocalDate to days since the epoch and the Instant to milliseconds since the epoch, so the values match the primitive representations that the logical types expect.

Now, let’s implement the method that handles the deserialization:

public static Pair<LocalDate, Instant> deserializeDateWithLogicalType(byte[] bytes) throws IOException {
    Schema schema = createDateSchema();
    DatumReader<GenericRecord> datumReader = new GenericDatumReader<>(schema);
    Decoder decoder = DecoderFactory.get().binaryDecoder(bytes, null);
    
    GenericRecord record = datumReader.read(null, decoder);
    
    // reverse the epoch-based conversions; Pair comes from Apache Commons Lang 3
    LocalDate date = LocalDate.ofEpochDay((int) record.get("date"));
    Instant timestamp = Instant.ofEpochMilli((long) record.get("timestamp"));
    
    return Pair.of(date, timestamp);
}

Finally, let’s test our implementation:

@Test
void whenSerializingDateWithLogicalType_thenDeserializesCorrectly() throws IOException {
    LocalDate expectedDate = LocalDate.now();
    Instant expectedTimestamp = Instant.now();

    byte[] serialized = serializeDateWithLogicalType(expectedDate, expectedTimestamp);
    Pair<LocalDate, Instant> deserialized = deserializeDateWithLogicalType(serialized);

    assertEquals(expectedDate, deserialized.getLeft());
    assertEquals(expectedTimestamp.toEpochMilli(), deserialized.getRight().toEpochMilli(),
            "Timestamps should match exactly at millisecond precision");
}

As we can see from the test, the timestamp-millis logical type maintains precision, and the timestamps match as expected. Furthermore, using logical types makes our data format explicit in the schema definition, which is valuable for schema evolution and documentation.

3.2. Using Avro’s Conversion API

Avro provides a conversion API that can handle logical types automatically. This API isn't a separate approach; in fact, it's built on top of logical types and simplifies the conversion process.

As such, it saves us from manually converting between Java types and Avro’s internal representation. Furthermore, it adds type safety to the conversion process.

Now, let's implement the serialization side using these conversion classes:

public static byte[] serializeWithConversionApi(LocalDate date, Instant timestamp) throws IOException {
    Schema schema = createDateSchema();
    GenericRecord record = new GenericData.Record(schema);
    
    // the schema from createDateSchema() already declares the logical types,
    // so we only need the conversion objects themselves
    Conversion<LocalDate> dateConversion = new TimeConversions.DateConversion();
    Conversion<Instant> timestampConversion = new TimeConversions.TimestampMillisConversion();
    
    record.put("date", dateConversion.toInt(date, 
      schema.getField("date").schema(), LogicalTypes.date()));
    record.put("timestamp", timestampConversion.toLong(timestamp, 
      schema.getField("timestamp").schema(), LogicalTypes.timestampMillis()));
    
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    DatumWriter<GenericRecord> datumWriter = new GenericDatumWriter<>(schema);
    Encoder encoder = EncoderFactory.get().binaryEncoder(baos, null);
    
    datumWriter.write(record, encoder);
    encoder.flush();
    
    return baos.toByteArray();
}

Unlike the previous approach, this time the DateConversion and TimestampMillisConversion classes handle the mapping between the java.time values and Avro's primitive representations, with LogicalTypes.date() and LogicalTypes.timestampMillis() identifying the logical types involved.

Next, let’s implement the method that handles the deserialization:

public static Pair<LocalDate, Instant> deserializeWithConversionApi(byte[] bytes) throws IOException {
    Schema schema = createDateSchema();
    DatumReader<GenericRecord> datumReader = new GenericDatumReader<>(schema);
    Decoder decoder = DecoderFactory.get().binaryDecoder(bytes, null);
    
    GenericRecord record = datumReader.read(null, decoder);
    
    Conversion<LocalDate> dateConversion = new TimeConversions.DateConversion();
    Conversion<Instant> timestampConversion = new TimeConversions.TimestampMillisConversion();
    
    int daysSinceEpoch = (int) record.get("date");
    long millisSinceEpoch = (long) record.get("timestamp");
    
    LocalDate date = dateConversion.fromInt(daysSinceEpoch, 
      schema.getField("date").schema(), LogicalTypes.date());
    Instant timestamp = timestampConversion.fromLong(millisSinceEpoch, 
      schema.getField("timestamp").schema(), LogicalTypes.timestampMillis());
    
    return Pair.of(date, timestamp);
}

Finally, let’s verify the implementation:

@Test
void whenSerializingWithConversionApi_thenDeserializesCorrectly() throws IOException {
    LocalDate expectedDate = LocalDate.now();
    Instant expectedTimestamp = Instant.now();

    byte[] serialized = serializeWithConversionApi(expectedDate, expectedTimestamp);
    Pair<LocalDate, Instant> deserialized = deserializeWithConversionApi(serialized);

    assertEquals(expectedDate, deserialized.getLeft());
    assertEquals(expectedTimestamp.toEpochMilli(), deserialized.getRight().toEpochMilli(),
            "Timestamps should match at millisecond precision");
}
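As a side note, instead of invoking the conversions by hand, we can register them on a GenericData model and let the writer apply them automatically whenever a field's schema carries a matching logical type. Here's a minimal sketch of the serialization side, reusing our createDateSchema() helper (the method name is our own):

public static byte[] serializeWithRegisteredConversions(LocalDate date, Instant timestamp) throws IOException {
    Schema schema = createDateSchema();
    
    // a dedicated model keeps the registrations out of the global GenericData.get()
    GenericData model = new GenericData();
    model.addLogicalTypeConversion(new TimeConversions.DateConversion());
    model.addLogicalTypeConversion(new TimeConversions.TimestampMillisConversion());
    
    GenericRecord record = new GenericData.Record(schema);
    record.put("date", date);            // LocalDate converted on write
    record.put("timestamp", timestamp);  // Instant converted on write
    
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    DatumWriter<GenericRecord> datumWriter = new GenericDatumWriter<>(schema, model);
    Encoder encoder = EncoderFactory.get().binaryEncoder(baos, null);
    
    datumWriter.write(record, encoder);
    encoder.flush();
    
    return baos.toByteArray();
}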

4. Handling Legacy Code That Uses Date

Many existing Java applications still use the legacy java.util.Date class. For such codebases, we need a strategy for handling these objects when serializing with Avro.

A good approach is to convert legacy dates to the modern Java time API before we serialize the information:

public static byte[] serializeLegacyDateAsModern(Date legacyDate) throws IOException {
    // bridge the legacy Date to java.time
    Instant instant = legacyDate.toInstant();
    LocalDate localDate = instant.atZone(ZoneId.systemDefault()).toLocalDate();
    
    return serializeDateWithLogicalType(localDate, instant);
}

This way, we bridge to java.time first and then reuse one of the earlier serialization methods, taking advantage of Avro's logical types while still accepting legacy Date objects.
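For the opposite direction, when deserialized values must flow back into legacy code, Date.from() bridges an Instant to java.util.Date. Here's a minimal sketch, with toLegacyDate being a hypothetical helper of our own:

public static Date toLegacyDate(Instant instant) {
    // Date.from() keeps millisecond precision, which matches
    // the timestamp-millis logical type we serialized with
    return Date.from(instant);
}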

Let’s also test our implementation:

@Test
void whenSerializingLegacyDate_thenConvertsCorrectly() throws IOException {
    Date legacyDate = new Date();
    LocalDate expectedLocalDate = legacyDate.toInstant()
      .atZone(ZoneId.systemDefault())
      .toLocalDate();

    byte[] serialized = serializeLegacyDateAsModern(legacyDate);
    LocalDate deserialized = deserializeDateWithLogicalType(serialized).getLeft();
    
    assertEquals(expectedLocalDate, deserialized);
}

5. Conclusion

In this article, we’ve explored different ways to serialize Date objects using Avro. We’ve learned how to use Avro’s logical types to properly represent date and timestamp values.

For most modern applications, using Avro's Conversion API to handle its logical types with java.time classes provides the best approach. Through this combination, we get type safety, proper semantics, and compatibility with Avro's schema evolution capabilities.

The code backing this article is available on GitHub.