1. Overview

This tutorial will continue to explore some of the core features of Spring Data MongoDB – the @DBRef annotation and lifecycle events.

2. @DBRef

By default, the mapping framework stores child objects embedded within the parent document rather than managing them as separate parent-child documents. What we can do, though, is store them separately and use a DBRef to refer to those documents.

When the object is loaded from MongoDB, those references will be eagerly resolved, and we’ll get back a mapped object that looks the same as if it had been stored embedded within our master document.

Let’s look at some code:

@DBRef
private EmailAddress emailAddress;

EmailAddress looks like:

@Document
public class EmailAddress {
    @Id
    private String id;
    
    private String value;
    
    // standard getters and setters
}
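
For context, the owning User entity could look something like the sketch below; the exact class isn't listed here, so the remaining fields are assumed from the test and the database output shown later:

@Document
public class User {
    @Id
    private String id;

    private String name;

    private Integer age;

    @DBRef
    private EmailAddress emailAddress;

    // standard getters and setters
}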

Note that the mapping framework doesn’t handle cascading operations. So, for instance, if we trigger a save on the parent, the child won’t be saved automatically; we’ll need to explicitly trigger the save on the child if we want to persist it as well.
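
Without cascading support, a manual save would look roughly like this (a minimal sketch, assuming a user instance and the mongoOperations bean used later in the article):

// the child must be saved first so it has an id the DBRef can point to
mongoOperations.save(user.getEmailAddress());
mongoOperations.save(user);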

This is precisely where lifecycle events come in handy.

3. Lifecycle Events

Spring Data MongoDB publishes some very useful lifecycle events – such as onBeforeConvert, onBeforeSave, onAfterSave, onAfterLoad and onAfterConvert.

To intercept one of the events, we need to register a subclass of AbstractMongoEventListener and override one of its methods. When the event is dispatched, our listener will be called and the domain object passed in.
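
For instance, a trivial listener that simply logs every persisted document might look like this (just an illustrative sketch, separate from the cascade example that follows):

public class LoggingMongoEventListener extends AbstractMongoEventListener<Object> {

    @Override
    public void onAfterSave(AfterSaveEvent<Object> event) {
        // called after the document has been written to MongoDB
        System.out.println("Saved: " + event.getSource());
    }
}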

3.1. Basic Cascade Save

Let’s look at the example we had earlier – saving the user with the emailAddress. We can now listen to the onBeforeConvert event which will be called before a domain object goes into the converter:

public class UserCascadeSaveMongoEventListener extends AbstractMongoEventListener<Object> {
    @Autowired
    private MongoOperations mongoOperations;

    @Override
    public void onBeforeConvert(BeforeConvertEvent<Object> event) { 
        Object source = event.getSource(); 
        if ((source instanceof User) && (((User) source).getEmailAddress() != null)) { 
            mongoOperations.save(((User) source).getEmailAddress());
        }
    }
}

Now we just need to register the listener in our MongoConfig:

@Bean
public UserCascadeSaveMongoEventListener userCascadingMongoEventListener() {
    return new UserCascadeSaveMongoEventListener();
}

Or as XML:

<bean class="org.baeldung.event.UserCascadeSaveMongoEventListener" />

And we have cascading semantics all done – albeit only for the User entity.

3.2. A Generic Cascade Implementation

Let’s now improve the previous solution by making the cascade functionality generic. Let’s start by defining a custom annotation:

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
public @interface CascadeSave {
    //
}

Let’s now work on our custom listener to handle these fields generically and not have to cast to any particular entity:

public class CascadeSaveMongoEventListener extends AbstractMongoEventListener<Object> {

    @Autowired
    private MongoOperations mongoOperations;

    @Override
    public void onBeforeConvert(BeforeConvertEvent<Object> event) { 
        Object source = event.getSource(); 
        ReflectionUtils.doWithFields(source.getClass(), 
          new CascadeCallback(source, mongoOperations));
    }
}

So we’re using Spring’s reflection utility, and we’re running our callback on all fields that meet our criteria:

@Override
public void doWith(Field field) throws IllegalArgumentException, IllegalAccessException {
    ReflectionUtils.makeAccessible(field);

    if (field.isAnnotationPresent(DBRef.class) && 
      field.isAnnotationPresent(CascadeSave.class)) {
    
        Object fieldValue = field.get(getSource());
        if (fieldValue != null) {
            FieldCallback callback = new FieldCallback();
            ReflectionUtils.doWithFields(fieldValue.getClass(), callback);

            getMongoOperations().save(fieldValue);
        }
    }
}

As you can see, we’re looking for fields that have both the @DBRef and the @CascadeSave annotations. Once we find such a field, we save the child entity.
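
For reference, the doWith method above lives in the CascadeCallback class we instantiated earlier; a minimal sketch of the surrounding class (the sample project’s version may differ slightly) would be:

public class CascadeCallback implements ReflectionUtils.FieldCallback {

    private Object source;
    private MongoOperations mongoOperations;

    public CascadeCallback(Object source, MongoOperations mongoOperations) {
        this.source = source;
        this.mongoOperations = mongoOperations;
    }

    private Object getSource() {
        return source;
    }

    private MongoOperations getMongoOperations() {
        return mongoOperations;
    }

    // doWith(Field field) - shown above
}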

Let’s look at the FieldCallback class, which we’re using to check if the child has an @Id annotation:

public class FieldCallback implements ReflectionUtils.FieldCallback {
    private boolean idFound;

    public void doWith(Field field) throws IllegalArgumentException, IllegalAccessException {
        ReflectionUtils.makeAccessible(field);

        if (field.isAnnotationPresent(Id.class)) {
            idFound = true;
        }
    }

    public boolean isIdFound() {
        return idFound;
    }
}
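
The earlier doWith snippet runs the FieldCallback but doesn’t act on its result; if we want to make use of it, one option (purely a suggestion, not part of the original listener) is to fail fast when the child entity has no @Id field before attempting the save:

FieldCallback callback = new FieldCallback();
ReflectionUtils.doWithFields(fieldValue.getClass(), callback);

// refuse to cascade to a child that can't be stored as a referenced document
if (!callback.isIdFound()) {
    throw new IllegalStateException(
      "Cannot cascade-save a child entity without an @Id field: " + fieldValue.getClass());
}
getMongoOperations().save(fieldValue);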

Finally, to make it all work together, we of course need the emailAddress field to be annotated correctly:

@DBRef
@CascadeSave
private EmailAddress emailAddress;

3.3. The Cascade Test

Let’s now have a look at a scenario – we save a User with an emailAddress, and the save operation cascades to the referenced EmailAddress entity automatically:

User user = new User();
user.setName("Brendan");
EmailAddress emailAddress = new EmailAddress();
emailAddress.setValue("[email protected]");
user.setEmailAddress(emailAddress);
mongoTemplate.insert(user);
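
If we want to verify the cascade programmatically, a quick check along these lines should do (a sketch; the assertion style depends on the test setup):

// the cascaded EmailAddress should now live in its own collection
List<EmailAddress> addresses = mongoTemplate.findAll(EmailAddress.class);
assertFalse(addresses.isEmpty());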

Let’s check our database:

{
    "_id" : ObjectId("55cee9cc0badb9271768c8b9"),
    "name" : "Brendan",
    "age" : null,
    "email" : {
        "value" : "[email protected]"
    }
}

4. Conclusion

In this article, we illustrated some cool features of Spring Data MongoDB – the @DBRef annotation, lifecycle events and how we can handle cascading intelligently.

The implementation of all these examples and code snippets can be found over on GitHub – this is a Maven-based project, so it should be easy to import and run as it is.
