
Scalameta tutorial: Cache decorator

Posted by Qing Wei on Tue, Sep 5, 2017


This is a tutorial showing how to use Scalameta to develop a generic, parameterized annotation. To learn how to set up a project to use scalameta, refer to the official docs.

What is scalameta?

Scalameta is the de-facto toolkit for metaprogramming in Scala. For those who are new to metaprogramming, it means programming against code/syntax/ASTs.

Metaprogramming is very useful when you notice a repeating pattern in your code, but you are not able to refactor it due to limitations of the programming language.

Conceptually, scalameta allows you to access your code as data (Abstract Syntax Tree), and manipulate it at compile time.
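For instance, a minimal sketch (assuming a project with the scalameta dependency already on the classpath) that parses a snippet of source into a tree and inspects it:

```scala
import scala.meta._

object AstDemo extends App {
  // Parse a method definition into its AST representation
  val tree: Stat = "def double(x: Int): Int = x * 2".parse[Stat].get

  // `structure` shows the tree as data, e.g. Defn.Def(Nil, Term.Name("double"), ...)
  println(tree.structure)

  // `syntax` pretty-prints the tree back into source code
  println(tree.syntax)
}
```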

Problem: Caching

Caching is a common technique that almost all programmers are familiar with. In this tutorial, we will develop a cache macro that

  • has low syntactic overhead, i.e. it should not require changing the cached function much
  • supports different cache storage backends, e.g. an in-memory cache, Elasticsearch, etc.
  • can cache methods with multiple arguments

Naive implementation

Let's start with a simple implementation without using a macro.

Code to support cache function
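A sketch of that support code, built around a `CacheBackend[K, V]` trait with `get`/`put` as the prose below describes (the `InMemoryCache` implementation and `cache` helper names are illustrative, not necessarily the post's exact listing):

```scala
import scala.collection.mutable

// Pluggable cache storage, keyed by K, storing V
trait CacheBackend[K, V] {
  def get(k: K): Option[V]
  def put(k: K, v: V): Unit
}

// Simplest possible backend: an in-memory mutable map
class InMemoryCache[K, V] extends CacheBackend[K, V] {
  private val store = mutable.Map.empty[K, V]
  def get(k: K): Option[V] = store.get(k)
  def put(k: K, v: V): Unit = store.update(k, v)
}

object Caching {
  // The cache function: look the key up, computing and storing on a miss
  def cache[K, V](backend: CacheBackend[K, V])(k: K)(compute: => V): V =
    backend.get(k).getOrElse {
      val v = compute
      backend.put(k, v)
      v
    }
}
```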

Calling cache function
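A sketch of the call site, assuming a `CacheBackend[Int, Int]` instance is available (here via a hypothetical `InMemoryCache` implementation):

```scala
// A concrete backend instance for this call site
val fibCache: CacheBackend[Int, Int] = new InMemoryCache[Int, Int]

// Note: the implementation itself has to change to consult the cache
def cachedFib(n: Int): Int =
  fibCache.get(n).getOrElse {
    val result = if (n <= 1) n else cachedFib(n - 1) + cachedFib(n - 2)
    fibCache.put(n, result)
    result
  }

cachedFib(30) // computed once per distinct n, then served from the cache
```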


Pros

  • it's simple and easy to understand
  • it supports different cache storage backends
  • it works on methods with any number of arguments

Cons

  • It is a bit intrusive: notice that the implementation of cachedFib needs to change
  • It is awkward for functions with multiple arguments, because CacheBackend only takes K and V as type parameters; for a function with signature def fn(x: Int, y: Int): Int, you need to combine x and y into a tuple (Int, Int) to fit CacheBackend's type signature

Let's see how we can improve it using scalameta.

Scalameta Implementation 1

Here, we are going to implement the cache function as a macro; the end goal is to support syntax like this:
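Something along these lines (the backend name `InMemoryCache` and the annotation's exact type parameters are illustrative):

```scala
// The annotation carries the backend; the method definition is untouched
@cache[Int](new InMemoryCache[Int, Int]())
def fib(n: Int): Int =
  if (n <= 1) n else fib(n - 1) + fib(n - 2)

fib(30) // the macro-generated body consults the cache transparently
```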

Before we jump into the implementation, we can observe a few differences from the previous implementation:


  • The syntax is cleaner; it does not change the method's definition at all
  • It still supports different cache storage backends
  • Too much MAGIC: how does it even work?
  • Does it support functions with multiple arguments?

Let's answer the first question: how does it work?

Below is the implementation of the cache macro; let's go through the comments to understand what it does.
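A rough sketch of the shape of such a macro annotation, written in the scalameta 1.x new-style macro syntax (`inline`/`meta`, which requires the macro paradise compiler plugin); the exact quasiquote patterns here are assumptions, not the post's verbatim listing:

```scala
import scala.meta._

// A parameterized macro annotation: T is the cached value type,
// `backend` supplies the storage.
class cache[T](backend: Any) extends scala.annotation.StaticAnnotation {
  inline def apply(defn: Any): Any = meta {
    // Inside `meta`, `this` is the AST of the annotation application
    // itself, e.g. `new cache[Int](new InMemoryCache[Int, Int]())`.
    // A quasiquote pattern captures the type argument (tpr) and the
    // backend expression (backendParam) at compile time.
    val q"new $_[$tpr]($backendParam)" = this
    defn match {
      case d: Defn.Def => CacheMacroImpl.expand(tpr, backendParam, d)
      case _           => abort("@cache can only annotate a method (def)")
    }
  }
}
```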

That is quite a lot of information, especially around quasiquotes. You might have a few questions, such as: what is the type of the tpr that we've captured? I will go through these in the next section, but here I'd like you to get familiar with the general flow. Basically, we are trying to

  1. use pattern matching to capture relevant information from the AST [Compile Time]
  2. perform transformation on the AST [Compile Time]
  3. the transformed AST then gets compiled into an artifact that is invoked at runtime

If you're interested in learning more about quasiquotes, here is the reference for all quasiquote syntax.

Now let's inspect the implementation of the AST transformation logic, i.e. CacheMacroImpl.expand(tpr, backendParam, defn).

I hope the implementation is not too intimidating; it does the following:

  1. Check whether the annotated method is allowed (curried methods are not allowed)
  2. Check the number of arguments
    • If 1, use it as is
    • If multiple, convert them into a tuple, and use the tuple as key of cache
  3. Try to get data from cache using cache.get(key)
    • If cache hit, return the cached value
    • If cache miss, evaluate the original annotated method, cache the result using cache.put(k, v), and return the result
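The steps above can be sketched roughly as follows. This is a reconstruction, not the post's exact listing: the quasiquote details are assumptions, and it assumes the backend argument passed to the annotation is a stable reference (e.g. a val), since its expression is spliced directly into the rewritten body:

```scala
import scala.collection.immutable.Seq
import scala.meta._

object CacheMacroImpl {
  def expand(tpr: Type, backendParam: Term, defn: Defn.Def): Defn.Def =
    defn match {
      // 1. Curried methods (more than one parameter list) are rejected
      case Defn.Def(_, _, _, paramss, _, _) if paramss.length > 1 =>
        abort("@cache does not support curried methods")

      case d @ Defn.Def(_, _, _, Seq(params), _, body) =>
        // 2. Build the cache key: a single argument is used as-is,
        //    multiple arguments are packed into a tuple
        val args: Seq[Term.Name] = params.map(p => Term.Name(p.name.value))
        val key: Term = if (args.length == 1) args.head else q"(..$args)"

        // 3. A cache hit returns the stored value; a miss evaluates the
        //    original body, stores the result, and returns it
        val cachedBody: Term =
          q"""
            $backendParam.get($key) match {
              case Some(hit) => hit
              case None =>
                val result: $tpr = $body
                $backendParam.put($key, result)
                result
            }
          """
        d.copy(body = cachedBody)

      case _ =>
        abort("@cache can only annotate a method (def)")
    }
}
```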


Pros

  • it supports different cache storage backends, e.g. you could implement a CacheBackend that supports TTL
  • it has almost zero syntactic overhead for the caller
  • it works on methods with any number of arguments

Cons

  • It is more complicated to implement
  • The implementation is harder to debug


Here we end this tutorial, having shown how you can create a generic, parameterized macro using scalameta. The code is available here. As an exercise, readers can try to improve the cache so that it supports async get and put.

Note: I am not claiming that a cache annotation is a good use-case for macros; ultimately it depends on your team and the problem at hand. Nonetheless, I believe everyone should learn a bit of metaprogramming to sharpen their skills and to better understand how the compiler views their code.
