1. Overview

Caching is often essential for high-performance applications. As a result, many caching libraries have been developed, which makes choosing one over another a difficult task. Additionally, switching from one caching library to another may involve a large amount of refactoring. ScalaCache is a possible solution to these problems.

ScalaCache is a facade for caching libraries. It simplifies cache handling and also provides a standardized API for easy usage.

In this tutorial, we’ll look at ScalaCache and how it simplifies caching operations.

2. Intro to ScalaCache

ScalaCache supports a wide variety of caching libraries, including Redis, Memcached, Guava Cache, Caffeine, and EhCache. With ScalaCache, we can swap any of these caching libraries for another with minimal refactoring.

Using ScalaCache has many advantages:

  • A standardized API across caching libraries
  • Ability to easily switch between different caching libraries without much refactoring
  • Simple and idiomatic Scala APIs
  • Synchronous and asynchronous modes

3. Setup

3.1. SBT Dependencies

Let’s start by adding the ScalaCache SBT dependency:

"com.github.cb372" %% "scalacache-core" % "0.28.0"

Next, we’ll need to add the dependency for the required caching library wrapper. For our examples, we’ll be using GuavaCache:

"com.github.cb372" %% "scalacache-guava" % "0.28.0"

3.2. Configurations

To start using ScalaCache, we’ll instantiate the relevant cache instance. Let’s assume that we are using GuavaCache for caching User information:

import com.google.common.cache.CacheBuilder
import scalacache.{Cache, Entry}
case class User(id: Long, name: String)
val underlyingGuavaCache = CacheBuilder.newBuilder().maximumSize(10000L).build[String, Entry[User]]

Next, we’ll need to instantiate ScalaCache using the Guava cache instance:

import scalacache.guava.GuavaCache
implicit val scalaCacheGuava: Cache[User] = GuavaCache(underlyingGuavaCache)

Note that scalaCacheGuava is declared as an implicit value, so ScalaCache can pick it up automatically wherever it is in scope.

4. Caching Operations

ScalaCache supports three types of caching:

  • Manual Caching: Explicitly putting, getting, and removing cache entries ourselves
  • Memoization: A simple way to automatically cache the results of a method using the memoize, memoizeSync, and memoizeF methods
  • Caching Block: Similar to memoization, but applied to a block of code instead of a whole method, using the caching and cachingF methods

The following sections provide more details about these different types.

5. Modes

ScalaCache supports both synchronous and asynchronous modes of caching. It is also possible to wrap the results in popular effect types like Cats IO, Monix Task, Scalaz Task, and Try.

5.1. Sync

Sync mode provides the easiest way to cache data. Operations in this mode block the current thread until the cache call completes and return the result directly. It's a good fit for in-memory caches. To use it, we need the following import:

import scalacache.modes.sync._
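
For instance, with this import and the implicit Guava cache from earlier in scope, reads and writes block the calling thread and return plain values. Here is a minimal sketch; the key and value are arbitrary:

import scalacache._
import scalacache.modes.sync._

// put and get block until the cache operation completes; in sync mode, get returns the Option directly
put("user-1")(User(1, "sync-user"))
val maybeUser = get("user-1") // Some(User(1, "sync-user"))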

5.2. Try

In Try mode, the caching results are wrapped in scala.util.Try. If any failures occur during caching operations, ScalaCache will wrap the result in Failure:

import scala.util.Try
import scalacache.memoization._
import scalacache.modes.try_._

def getUserTry(userId: Long): Try[User] =
  memoize[Try, User](None) {
    User(userId, "try-user")
  }

Note the extra underscore after try in the import statement; it's needed because try is a reserved word in Scala.
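
The first call for a given id runs the block and caches the result; a simple usage sketch:

// Success(User(1, "try-user")); subsequent calls with the same id are served from the cache
val result: Try[User] = getUserTry(1)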

5.3. Future

ScalaCache also supports Future-based async APIs for caching. The caching operations run on a separate thread and return a Future result. This requires an implicit ExecutionContext in scope:

import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._
import scalacache.memoization._
import scalacache.modes.scalaFuture._

def getUserFuture(userId: Long): Future[User] = {
  memoize[Future, User](Some(10.seconds)) {
    User(userId, "future")
  }
}

5.4. Effect Type APIs

As mentioned earlier, ScalaCache also supports popular effect-type libraries. These are not part of the core library, so we need to add a specific dependency for each effect type we want to use. For instance, to use CatsIO, we add:

"com.github.cb372" %% "scalacache-cats-effect" % "0.28.0"

After that, we can cache our data with CatsIO by using the relevant mode:

import cats.effect.IO
import scalacache.Mode
import scalacache.memoization._

implicit val mode: Mode[IO] = scalacache.CatsEffect.modes.async
def getUserCatsIO(id: Long): IO[User] = {
  memoize[IO, User](None) {
    User(id, "io-user")
  }
}
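
Since the 0.28.x scalacache-cats-effect module targets a pre-3.x cats-effect, we can run the resulting IO with unsafeRunSync(), for example:

// Runs the IO; a second call with the same id is served from the cache
val user: User = getUserCatsIO(1).unsafeRunSync()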

6. Manual Cache Operations

Like any other caching library, we can use the ScalaCache APIs to get, put and remove data from the underlying cache:

import scalacache._
import scalacache.modes.sync._

def getUser(userId: Long): User = {
  // Try the cache first; in sync mode, get returns an Option directly
  get(buildUserKey(userId)).getOrElse {
    // On a cache miss, load from the database and populate the cache
    val fromDB = queryUserFromDB(userId)
    put(buildUserKey(userId))(fromDB)
    fromDB
  }
}

The above code tries to get the user details from the cache first. If they're not available, it reads them from the database and stores the value in the cache before returning it.
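
We can also evict an entry manually with remove, for instance after the underlying data changes. Here is a small sketch; updateUserInDB is a hypothetical helper:

def updateUser(user: User): User = {
  val updated = updateUserInDB(user) // hypothetical DB update
  // Drop the stale entry so the next read repopulates the cache
  remove(buildUserKey(user.id))
  updated
}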

7. Memoization

ScalaCache makes it very easy to cache method results. Memoization caches the result of a method, and when the same method is invoked with the same arguments, it returns the cached result instead of executing the method again. The method name and argument values are used to build the cache key.

7.1. Synchronous Memoization

We can use the memoizeSync() method to memoize the results of synchronous methods. For this to work, we need the implicit cache instance in scope, along with the sync mode import:

package com.baeldung.cache.service

import scala.concurrent.duration._
import scalacache.memoization._
import scalacache.modes.sync._

// Requires an implicit Cache[User] in scope, such as the scalaCacheGuava instance from section 3.2
class SyncQueryMemoizeService {
  def getUser(userId: Long): User =
    memoizeSync(Some(10.seconds)) {
      queryUserFromDB(userId)
    }
}

We can also provide a TTL for the memoized method. The result will be cached for the given duration and evicted once it has elapsed. If we don't provide a TTL value, ScalaCache will cache the result forever.
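
For instance, passing None caches the result with no expiry (the method name here is just for illustration):

def getUserForever(userId: Long): User =
  memoizeSync(None) {
    queryUserFromDB(userId)
  }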

The key for the above method is generated automatically. The default generator builds the cache key from the fully qualified class name, the method name, and the parameter values; for example, getUser(42) would produce a key roughly of the form com.baeldung.cache.service.SyncQueryMemoizeService.getUser(42).

7.2. Memoization with Effect

If the method returns any effect containers like Future or CatsIO, then we’ll use the memoizeF() method. We need to provide two type parameters for the memoizeF(). The first one is the container type, and the second is the actual result type.

Let’s say we’re caching a method that returns a Future:

def getUser(userId: Long): Future[User] =
  memoizeF[Future, User](Some(10.seconds)) {
    queryUserFromDB(userId)
  }

ScalaCache will not cache the data if the result is a failed Future. On the next invocation, it will execute the method again and cache the result if the Future succeeds.

Instead of memoizeF, we can also use the memoize method. In that case, ScalaCache will run the block on the provided ExecutionContext in another thread, as shown in the sketch below.
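
Here's a minimal sketch of that variant, using the Future mode and the global ExecutionContext (the method name is ours):

import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._
import scalacache.memoization._
import scalacache.modes.scalaFuture._

def getUserAsync(userId: Long): Future[User] =
  memoize[Future, User](Some(10.seconds)) {
    // The block itself is synchronous; ScalaCache runs it on the ExecutionContext
    queryUserFromDB(userId)
  }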

8. Caching Block

We can also enable caching for a particular block of code. It works much like memoization, except that we provide the cache key parts ourselves. We enable a caching block using the caching() method:

def getUser(id: Long) = {
  caching("id", id)(None) {
    queryResult(id)
  }
}

If we invoke the above method with the value 22, then ScalaCache automatically generates the cache key id:22. We can pass any number of key parts to the caching method for key generation:

caching("id", id, "cache", "key")(None) {
  queryResult(id)
}

This will generate a cache key for the value 22 as id:22:cache:key.
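
Because the key is built from the parts we pass, we can evict the same entry manually by passing the same parts to remove (a small sketch):

// Removes the entry stored under id:22:cache:key (for id = 22)
remove("id", id, "cache", "key")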

Similar to memoizeF, we can also use cachingF to cache a result with a type container:

def getUserFuture(id: Long) = {
  cachingF("keyF", id)(None) {
    queryResultFuture(id)
  }
}

9. Cache Key Customization

We can customize the cache key for memoization operations. By default, ScalaCache uses the excludeClassConstructorParams converter to generate the cache key. Instead, we can provide our own implementation and set it in the CacheConfig.

To write a custom generator, we need to extend the MethodCallToStringConverter trait and override its toString() method:

import scalacache.memoization.MethodCallToStringConverter

// Builds keys of the form "methodName#params", e.g. getUser#42
object CustomKeyGenerator extends MethodCallToStringConverter {
  override def toString(
    fullClassName: String,
    constructorParamss: IndexedSeq[IndexedSeq[Any]],
    methodName: String,
    paramss: IndexedSeq[IndexedSeq[Any]]
  ): String = {
    val keyPart = paramss.map { methParams =>
      methParams.map(_.toString).mkString("_")
    }.mkString("-")
    methodName + "#" + keyPart
  }
}

Next, we’ll set this custom key generator in the cache config. With the converter above, a call like getUser(42) will be stored under the key getUser#42:

import scalacache.{Cache, CacheConfig, Entry}
import scalacache.guava.GuavaCache
import scalacache.memoization.MemoizationConfig

val cache = CacheBuilder.newBuilder().maximumSize(10000L).build[String, Entry[User]]
implicit val customKeyCacheConfig: CacheConfig =
  CacheConfig(memoization = MemoizationConfig(toStringConverter = CustomKeyGenerator))
implicit val guavaCache: Cache[User] = GuavaCache(cache)

Note that customKeyCacheConfig is provided as an implicit value and is picked up while creating the GuavaCache instance. If we don't provide this implicit, ScalaCache will fall back to the default key generation.
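
With this cache in scope, memoized methods are stored under the custom keys; here's a quick sketch of a service using it (the class name is ours):

import scala.concurrent.duration._
import scalacache.Cache
import scalacache.memoization._
import scalacache.modes.sync._

// getUser(42) is now cached under "getUser#42" instead of the default key format
class CustomKeyUserService(implicit cache: Cache[User]) {
  def getUser(userId: Long): User =
    memoizeSync(Some(10.seconds)) {
      queryUserFromDB(userId)
    }
}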

10. Switching to Another Cache

If we need to use a cache other than Guava, switching is easy. We instantiate the required caching library and provide it to ScalaCache; the usage code barely changes. The only change required is the relevant import statements for the underlying cache. We can create a generic cache service class:

import scala.concurrent.duration._
import scalacache.Cache
import scalacache.memoization._
import scalacache.modes.sync._

class GenericCacheService(implicit val cache: Cache[User]) {
  def getUser(userId: Long): User =
    memoizeSync(Some(10.seconds)) {
      // DB querying logic goes here
      queryUserFromDB(userId)
    }
}

Now, we can create an instance of GenericCacheService with the required cache implicit in scope. To use the Guava cache:

import com.baeldung.cache.service.GuavaCacheMemoizationConfig.guavaCache
val guavaCacheService = new GenericCacheService()
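
The config object referenced by that import lives in the article's GitHub repository; a possible shape, consistent with the setup from section 3.2, might look like this (a sketch, not the exact source):

object GuavaCacheMemoizationConfig {
  import com.google.common.cache.CacheBuilder
  import scalacache.{Cache, Entry}
  import scalacache.guava.GuavaCache

  implicit val guavaCache: Cache[User] =
    GuavaCache(CacheBuilder.newBuilder().maximumSize(10000L).build[String, Entry[User]])
}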

To switch from Guava to Caffeine, we only need to add the SBT dependency for the scalacache-caffeine module. Then we'll create a Caffeine cache config and provide it as an implicit value:

import com.baeldung.cache.service.CaffeineCacheConfig.caffeineCache
val caffeineCacheService = new GenericCacheService()
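
The Caffeine module dependency and a matching config object could look like this (a sketch; the object name follows the import above, and the builder shown is Caffeine's own):

libraryDependencies += "com.github.cb372" %% "scalacache-caffeine" % "0.28.0"

object CaffeineCacheConfig {
  import com.github.benmanes.caffeine.cache.Caffeine
  import scalacache.{Cache, Entry}
  import scalacache.caffeine.CaffeineCache

  implicit val caffeineCache: Cache[User] =
    CaffeineCache(Caffeine.newBuilder().maximumSize(10000L).build[String, Entry[User]])
}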

11. Conclusion

In this article, we looked at ScalaCache and how it helps us handle caching requirements with ease. We also saw the different types of caching operations ScalaCache supports and how to switch between underlying cache libraries.

As always, the code samples used are available over on GitHub.
