

Scala library for boilerplate-free data transformations

What Chimney does

In the daily life of a strongly-typed language's programmer, it sometimes happens that we need to transform an object of one type into another object that contains many of the same or similar fields.

import java.time.ZonedDateTime

case class MakeCoffee(id: Int, kind: String, addict: String)
case class CoffeeMade(id: Int, kind: String, forAddict: String, at: ZonedDateTime)

The usual approach is to just rewrite the fields one by one:

import scala.util.Random

val command = MakeCoffee(id = Random.nextInt,
                         kind = "Espresso",
                         addict = "Piotr")
val event = CoffeeMade(id = command.id,
                       kind = command.kind,
                       forAddict = command.addict,
                       at = ZonedDateTime.now())

While the example stays lean, in real-life code we usually end up with tons of such boilerplate, especially when transforming between similar but independently defined models (for example API, domain, and persistence layers).

Chimney provides a compact DSL with which you can define transformation rules and transform your objects with as little boilerplate as possible.

import io.scalaland.chimney.dsl._

val event = command.into[CoffeeMade]
  .withFieldComputed(_.at, _ => ZonedDateTime.now())
  .withFieldRenamed(_.addict, _.forAddict)
  .transform

Underneath, it uses Scala macros to generate the transformation code at compile time, so you get type safety without runtime reflection overhead.
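To make that concrete, here is a hand-written sketch of the kind of code such a derived transformation expands to (a simplified illustration, not the macro's actual output; the at field is dropped so the snippet stays self-contained):

```scala
case class MakeCoffee(id: Int, kind: String, addict: String)
case class CoffeeMade(id: Int, kind: String, forAddict: String)

// Field-by-field constructor call: this is essentially what the generated
// transformation boils down to, with no reflection at runtime.
def makeCoffeeToCoffeeMade(command: MakeCoffee): CoffeeMade =
  CoffeeMade(
    id = command.id,
    kind = command.kind,
    forAddict = command.addict // the rename is wired in explicitly
  )
```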

Getting started

To include Chimney in your SBT project, add the following line to your build.sbt:

libraryDependencies += "io.scalaland" %% "chimney" % "0.4.1"

The library is released for Scala 2.11.x, 2.12.x and 2.13.x. If you want to use it with Scala.js (or Scala Native), you need to replace %% with %%%. Due to some compiler bugs, it's recommended to use at least Scala 2.11.9 or 2.12.1.

Trying with Ammonite REPL

The quickest way to try out Chimney is to use a script that downloads coursier and uses it to fetch Ammonite REPL with the latest version of Chimney. It drops you immediately into a REPL session.
curl -s | bash
Welcome to the Ammonite Repl 1.1.0
(Scala 2.12.4 Java 1.8.0_152)
@ case class Foo(x: String, y: Int)
defined class Foo

@ case class Bar(x: String, y: Int, z: Boolean = true)
defined class Bar

@ Foo("abc", 10).transformInto[Bar]
res2: Bar = Bar("abc", 10, true)


In this section you will learn how to use Chimney example by example.

Basic transformations

When the target object contains only fields present in the source object, with corresponding types, we can use the shorthand transformInto.

case class Catterpillar(size: Int, name: String)
case class Butterfly(size: Int, name: String)

val stevie = Catterpillar(5, "Steve")
val steve = stevie.transformInto[Butterfly]
// Butterfly(5, "Steve")

Nested transformations

It also works when the transformation needs to be recursive, possibly involving traversal of nested collections.

case class Youngs(insects: List[Catterpillar])
case class Adults(insects: List[Butterfly])

val kindergarden = Youngs(List(Catterpillar(5, "Steve"), Catterpillar(4, "Joe")))
val highschool = kindergarden.transformInto[Adults]
// Adults(List(Butterfly(5, "Steve"), Butterfly(4, "Joe")))

We can use it as long as Chimney can recursively construct a transformation for all fields of the target object. In this example the transformer for the List type is constructed based on the automatically derived Catterpillar ~> Butterfly mapping.
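The composition can be sketched without macros. Assuming a minimal stand-in for Chimney's Transformer type class (introduced later in this document), a transformer for lists can be built from any element transformer roughly like this (a sketch, not Chimney's actual implementation):

```scala
// Minimal stand-in for Chimney's Transformer type class.
trait Transformer[From, To] {
  def transform(src: From): To
}

case class Catterpillar(size: Int, name: String)
case class Butterfly(size: Int, name: String)

// Element-level mapping, analogous to the derived Catterpillar ~> Butterfly.
implicit val elemTransformer: Transformer[Catterpillar, Butterfly] =
  (c: Catterpillar) => Butterfly(c.size, c.name)

// The List transformer just reuses the element transformer on every item.
implicit def listTransformer[A, B](
    implicit t: Transformer[A, B]
): Transformer[List[A], List[B]] =
  (as: List[A]) => as.map(t.transform)
```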

Providing missing values

Let's add a field to our Butterfly case class.

case class Butterfly(size: Int, name: String, wingsColor: String)

Now, when trying to perform the same transformation, we get a compile-time error. This is naturally expected, as we don't have any data source for the new wingsColor field.

val stevie = Catterpillar(5, "Steve")
val steve = stevie.transformInto[Butterfly]
// error: Chimney can't derive transformation from Catterpillar to Butterfly
// Butterfly
//   wingsColor: String - no accessor named wingsColor in source type Catterpillar
// Consult for usage examples.
//        val steve = stevie.transformInto[Butterfly]
//                                        ^

In this scenario, we can use Chimney's syntax to provide the missing value. Notice that transformInto[T] is a shortcut for into[T].transform, where the latter form allows us to provide additional transformation rules.

val steve = stevie.into[Butterfly]
  .withFieldConst(_.wingsColor, "white")
  .transform
// Butterfly(5, "Steve", "white")

We can also construct a value dynamically, by providing a function.

val steve = stevie.into[Butterfly]
  .withFieldComputed(_.wingsColor, c => if (c.size > 4) "yellow" else "gray")
  .transform
// Butterfly(5, "Steve", "yellow")

Default values

Chimney also respects case classes' default values as a possible target field value source. When we want to rely on defaults, we don't need to provide values manually.

case class Butterfly(size: Int, name: String, wingsColor: String = "purple")

val steve = stevie.transformInto[Butterfly]
// Butterfly(5, "Steve", "purple")

Providing the value anyway in such a case would just make Chimney ignore the default from the case class.

Disabling default values

It is possible to disable the lookup of default values and require them to be passed explicitly, using the .disableDefaultValues operation.

val steve = stevie
  .into[Butterfly]
  .disableDefaultValues
  .transform
// error: Chimney can't derive transformation from Catterpillar to Butterfly
// Butterfly
//   wingsColor: String - no field named wingsColor in source type Catterpillar
// Consult for usage examples.
//            .transform
//            ^

Standard types

Chimney supports deriving transformers for many standard Scala types: Unit, Option, Either, and collection types including List, Vector, Set, Map, Array, and many more.

If you are interested in how they are handled, it's recommended to explore the test suite.

Value classes

As value classes are nowadays relatively pervasive, Chimney handles them in a special way, supporting automatic extraction and wrapping of the value class field.

object rich {
  case class PersonId(id: Int) extends AnyVal
  case class PersonName(name: String) extends AnyVal
  case class Person(personId: PersonId, personName: PersonName, age: Int)
}
object plain {
  case class Person(personId: Int, personName: String, age: Int)
}

val richPerson = rich.Person(rich.PersonId(10), rich.PersonName("Bill"), 30)
val plainPerson = richPerson.transformInto[plain.Person]
// plain.Person(10, "Bill", 30)
val richPerson2 = plainPerson.transformInto[rich.Person]
// rich.Person(PersonId(10), PersonName("Bill"), 30)

Field re-labelling

Sometimes a field only changes its name. In such a case you can use the withFieldRenamed operation to instruct the library about the renaming.

case class SpyGB(name: String, surname: String)
case class SpyRU(imya: String, familia: String)

val jamesGB = SpyGB("James", "Bond")

val jamesRU = jamesGB.into[SpyRU]
    .withFieldRenamed(, _.imya)
    .withFieldRenamed(_.surname, _.familia)
    .transform
// SpyRU("James", "Bond")

Default option values

When you have added an optional field to a type and want to write a migration from old data, you usually set the new optional field to None.

case class Foo(a: Int, b: String)
case class FooV2(a: Int, b: String, newField: Option[Double])

The usual approach would be to use .withFieldConst to set the new field's value.

Foo(5, "test")
  .into[FooV2]
  .withFieldConst(_.newField, None)
  .transform
// FooV2(5, "test", None)

At some scale this may turn out to be cumbersome. Therefore, it's possible to treat Option fields for which we can't find a counterpart in the data source as None by default. You just need to enable this behavior with .enableOptionDefaultsToNone.

Foo(5, "test")
  .into[FooV2]
  .enableOptionDefaultsToNone
  .transform
// FooV2(5, "test", None)

Default Unit value

When you have added a Unit field to a type and want to write a migration from old data, the new field should simply be set to (). Chimney does this automatically.

case class Foo(a: Int, b: String)
case class Bar(a: Int, b: String, newField: Unit)

Foo(5, "test").transformInto[Bar]
// Bar(5, "test", ())

Advanced techniques

Custom transformations

If the transformation is relatively complex, or for some reason you just want to bypass Chimney's derivation mechanism, you can always fall back to a simple function and plug it into the Chimney transformation.

The library defines a type class Transformer:

trait Transformer[From, To] {
  def transform(src: From): To
}

You can plug your own transformer in by providing an implicit instance in the local context.

import io.scalaland.chimney.dsl._
import io.scalaland.chimney.Transformer

object v1 {
  case class User(id: Int, name: String, street: String, postalCode: String)
}
object v2 {
  case class Address(street: String, postalCode: String)
  case class User(id: Int, name: String, addresses: List[Address])
}

implicit val userV1toV2: Transformer[v1.User, v2.User] =
  (user: v1.User) => v2.User(
    id = user.id,
    name = user.name,
    addresses = List(v2.Address(user.street, user.postalCode))
  )

val v1Users = List(
  v1.User(1, "Steve", "Love street", "27000"),
  v1.User(2, "Anna", "Broadway", "00321")
)
val v2Users = v1Users.transformInto[List[v2.User]]
// List(
//   v2.User(1, "Steve", List(Address("Love street", "27000"))),
//   v2.User(2, "Anna", List(Address("Broadway", "00321")))
// )

As we can see, Chimney correctly picked up the implicit Transformer[v1.User, v2.User] defined locally and used it in the transformation between lists of users. But do we really have to define such a custom transformer completely by hand?

Transformer definition DSL

One might think: if we only need to provide a function of type v1.User => v2.User, why not use Chimney's DSL to generate that transformation too?

implicit val userV2toV2: Transformer[v1.User, v2.User] =
  (user: v1.User) => user.into[v2.User]
    .withFieldComputed(_.addresses, u => List(v2.Address(u.street, u.postalCode)))
    .transform

While it looks reasonable, it will not work as expected :(

Chimney's macro, before trying to derive any transformer, looks for an instance of the required transformer in the implicit scope. Unfortunately, it will pick userV2toV2 itself, because the types match, the value is marked as implicit, and it is available in the macro expansion scope. Depending on a few details, this either ends up as a compilation error or leads to a StackOverflowError at runtime.

Since version 0.4.0 there is a simple solution to this problem.

implicit val userV2toV2: Transformer[v1.User, v2.User] =
  Transformer.define[v1.User, v2.User]
    .withFieldComputed(_.addresses, u => List(v2.Address(u.street, u.postalCode)))
    .buildTransformer

We need to use the special syntax Transformer.define[From, To], which introduces a transformer builder between types From and To. In the builder we can use all the operations available in the usual transformer DSL. The only difference is that we don't call .transform at the end (since we don't transform a value in place), but .buildTransformer (because we generate a transformer here). Such a generated transformer is semantically equivalent to the hand-written transformer from the previous section.

Chimney avoids this self-referencing implicit problem by not looking for an implicit Transformer[From, To] instance when the transformer builder Transformer.define[From, To] is used.

Recursive data types

Chimney can generate transformers between recursive data structures. Consider the following example.

case class Foo(x: Option[Foo])
case class Bar(x: Option[Bar])

We would like to define a transformer instance able to convert a value Foo(Some(Foo(None))) into Bar(Some(Bar(None))). In order to avoid the aforementioned issues with self-referencing, you must define your recursive transformer instance as an implicit def or implicit lazy val.

implicit def fooToBarTransformer: Transformer[Foo, Bar] =
  Transformer.derive[Foo, Bar] // or Transformer.define[Foo, Bar].buildTransformer

Foo(Some(Foo(None))).transformInto[Bar]
// Bar(Some(Bar(None)))

Coproducts support

With Chimney you can transform not only case classes, but sealed trait hierarchies (also known as coproducts) as well. Consider the two following hierarchy definitions.

sealed trait Color
object Color {
  case object Red extends Color
  case object Green extends Color
  case object Blue extends Color
}

sealed trait Channel
object Channel {
  case object Alpha extends Channel
  case object Blue extends Channel
  case object Green extends Channel
  case object Red extends Channel
}

Because the object names correspond, we can transform a Color into a Channel in a simple way.

val colRed: Color = Color.Red
val chanRed = colRed.transformInto[Channel]
// chanRed: Channel = Red

How about the other way round?

chanRed.transformInto[Color]

// error: Chimney can't derive transformation from Channel to Color
// Color
//   can't transform coproduct instance Channel.Alpha to Color
// Consult for usage examples.
//        chanRed.transformInto[Color]
//                             ^

This time we tried to transform a Channel into a Color. Notice that in this case the target hierarchy has no case object with a name corresponding to case object Alpha. Wanting to keep the transformation total, we need to somehow provide a value from the target domain. We can use withCoproductInstance to do that. Let's convert any Channel.Alpha to Color.Blue.

val red = chanRed.into[Color]
  .withCoproductInstance { (_: Channel.Alpha.type) => Color.Blue }
  .transform
// red: Color = Red

val alpha: Channel = Channel.Alpha
val blue = alpha.into[Color]
  .withCoproductInstance { (_: Channel.Alpha.type) => Color.Blue }
  .transform
// blue: Color = Blue

After providing the default, Chimney can prove the transformation is total and uses the provided function whenever it's needed.
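Conceptually, the resulting transformation behaves like the following hand-written pattern match (the hierarchies are repeated so the sketch is self-contained; this is an illustration, not the generated code itself):

```scala
sealed trait Color
object Color {
  case object Red extends Color
  case object Green extends Color
  case object Blue extends Color
}

sealed trait Channel
object Channel {
  case object Alpha extends Channel
  case object Blue extends Channel
  case object Green extends Channel
  case object Red extends Channel
}

// Total function over Channel: name-based matches for Red/Green/Blue,
// plus the user-provided mapping filling the hole for Alpha.
def channelToColor(ch: Channel): Color = ch match {
  case Channel.Alpha => Color.Blue // provided via withCoproductInstance
  case Channel.Red   => Color.Red
  case Channel.Green => Color.Green
  case Channel.Blue  => Color.Blue
}
```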


Patchers

Chimney also supports case class patching. It is a slightly different kind of transformation: you hold an object of some type, but want to modify only a subset of its fields. Consider the following example:

case class Email(address: String) extends AnyVal
case class Phone(number: Long) extends AnyVal

case class User(id: Int, email: Email, phone: Phone)
case class UserUpdateForm(email: String, phone: Long)

Let's assume you want to apply the update form to an existing object of type User.

val user = User(10, Email(""), Phone(1234567890L))
val updateForm = UserUpdateForm("", 123123123L)

user.patchUsing(updateForm)
// User(10, Email(""), Phone(123123123L))

Notice that when using patchers, we rely on the standard transformer derivation rules. In this case we used value classes in the User model but plain values in the update form; Chimney was able to derive a transformer for each patched field, and therefore to derive the patcher.
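For this particular example, the derived patcher is equivalent to a hand-written copy (classes repeated for self-containment; the helper name applyUpdateForm is made up for illustration):

```scala
case class Email(address: String) extends AnyVal
case class Phone(number: Long) extends AnyVal
case class User(id: Int, email: Email, phone: Phone)
case class UserUpdateForm(email: String, phone: Long)

// Each patched field goes through the corresponding transformer:
// here String ~> Email and Long ~> Phone wrap the plain values.
def applyUpdateForm(user: User, form: UserUpdateForm): User =
  user.copy(
    email = Email(form.email),
    phone = Phone(form.phone)
  )
```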

Redundant fields in patch

When the patch case class contains a field that does not exist in the patched object, Chimney will refuse to generate the patcher.

case class UserUpdateForm2(email: String, phone: Long, address: String)

user.patchUsing(UserUpdateForm2("", 123123123L, "some address"))
// Field named 'address' not found in target patching type User

This default behavior is intentional to prevent silent oversight of typos in patcher field names.

But there is a way to ignore redundant patcher fields explicitly.

user
  .using(UserUpdateForm2("", 123123123L, "some address"))
  .ignoreRedundantPatcherFields
  .patch
// User(10, Email(""), Phone(123123123L))

Optional patch values

It is possible to patch using optional values of type Option[T], as long as a transformer is available for T. If the value is present (Some), it's used for patching the field in the target object; otherwise (None) it's ignored and the field value is copied from the original object. Let us consider the following patch for the class User defined above:

case class UserPatch(email: Option[String], phone: Option[Phone])

Then it is possible to patch as follows:

val update = UserPatch(email = Some(""), phone = None)

user.patchUsing(update)
// User(10, Email(""), Phone(1234567890L))

The phone remained the same as in the original user, while the optional e-mail string got transformed into an Email instance.

Option[T] on both sides

An interesting case appears when both the patch case class and the patched object define a field f: Option[T]. Depending on the values of f in the patched object and the patch, we would like to apply the following semantics.

patchedObject.f   patch.f        patching result
None              Some(value)    Some(value)
Some(value1)      Some(value2)   Some(value2)
None              None           None
Some(value)       None           None or Some(value)?

When patch.f contains some value, it is used to replace the field in the target object (rows 1 and 2), regardless of the original field's value. When both fields are None, the patching result is also None (row 3).

But if the original object contains some value while the patch comes with a None, we can do one of two things: clear the value (set the field to None), or ignore the None in the patch and keep the original value.

Both choices may make perfect sense, depending on the context. By default, Chimney does the former (clears the value), but it also gives a simple way to always ignore Nones coming from the patch.

case class User(name: Option[String], age: Option[Int])
case class UserPatch(name: Option[String], age: Option[Int])

val user = User(Some("John"), Some(30))
val userPatch = UserPatch(None, None)

user.patchUsing(userPatch)
// clears both fields: User(None, None)

user
  .using(userPatch)
  .ignoreNoneInPatch
  .patch
// ignores updating both fields: User(Some("John"), Some(30))
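Per field, the two behaviors can be captured as plain functions over Option (a sketch of the semantics, not Chimney internals; the function names are made up):

```scala
// Default semantics: the patch value wins unconditionally,
// so a None in the patch clears the field.
def applyPatchField[T](original: Option[T], patch: Option[T]): Option[T] =
  patch

// ignoreNoneInPatch semantics: a None in the patch keeps the original value.
def applyPatchFieldIgnoringNone[T](original: Option[T], patch: Option[T]): Option[T] =
  patch.orElse(original)
```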

Java beans

Besides Scala case classes, Chimney supports transformation of Java beans.

Reading from Java beans

Chimney supports automatic field renaming for classes that follow Java beans naming convention. Let's assume the following classes:

class MyBean(private var id: Long,
             private var name: String,
             private var flag: Boolean) {
    def getId: Long = id
    def getName: String = name
    def isFlag: Boolean = flag
}

case class MyCaseClass(id: Long, name: String, flag: Boolean)

The conversion works if you explicitly enable it with .enableBeanGetters:

new MyBean(1L, "beanie", true)
  .into[MyCaseClass]
  .enableBeanGetters
  .transform
//  MyCaseClass(1L, "beanie", true)

Please note that Chimney matches accessor methods solely based on name and return type, and has no way of ensuring that a method named similarly to a getter is idempotent and does not actually perform side effects in its body.

Writing to Java beans

Dual to reading, Chimney supports transforming types into Java beans.

Chimney considers a class to be a bean if it has a public parameterless constructor and contains at least one single-parameter setter method (setX) returning Unit.

Chimney will then require data sources for all such setters.

class MyBean {
  private var id: Long = _
  private var name: String = _
  private var flag: Boolean = _

  def getId: Long = id
  def setId(id: Long): Unit = { this.id = id }

  def getName: String = name
  def setName(name: String): Unit = { this.name = name }

  def isFlag: Boolean = flag
  def setFlag(flag: Boolean): Unit = { this.flag = flag }
}

The conversion works if you explicitly enable it with .enableBeanSetters:

val obj = MyCaseClass(10L, "beanie", true)
val bean = obj
  .into[MyBean]
  .enableBeanSetters
  .transform

Chimney generates code equivalent to:

val bean = new MyBean
bean.setId(obj.id)
bean.setName(obj.name)
bean.setFlag(obj.flag)
bean


Currently it's not possible to override or provide values for missing setters.

Unsafe option

Chimney supports opt-in unsafe transformation from optional types into non-optional types.

This mode is enabled explicitly with .enableUnsafeOption. Transforming None into a concrete value will lead to NoSuchElementException at runtime, so use at your own risk.
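Semantically, the unsafe unwrapping amounts to calling .get on the Option, which is what makes it throw on None (a sketch with made-up Foo/Bar classes, not the generated code):

```scala
case class Foo(x: Option[Int])
case class Bar(x: Int)

// Hand-written equivalent of Foo(...).into[Bar].enableUnsafeOption.transform:
// Option.get throws NoSuchElementException when the value is None.
def fooToBarUnsafe(f: Foo): Bar = Bar(f.x.get)
```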

Mapping proto3 types

Unsafe option mode is typically useful when mapping proto3-generated classes to domain classes. ScalaPB wraps message fields in Option types. In certain scenarios it can be safe to assume that the value is always present, which allows for significant boilerplate reduction.

Here's an example protobuf definition:

syntax = "proto3";
package pb;

message Item {
    int32 id = 1;
    string name = 2;
}
message OrderLine {
    Item item = 1;
    int32 quantity = 2;
}
message Address {
    string street = 1;
    int32 zip_code = 2;
    string city = 3;
}
message Customer {
    int32 id = 1;
    string first_name = 2;
    string last_name = 3;
    Address address = 4;
}
message Order {
    repeated OrderLine lines = 1;
    Customer customer = 2;
}

And the equivalent domain model definitions:

package domain

case class Item(id: Int, name: String)
case class OrderLine(item: Item, quantity: Int)
case class Address(street: String, zipCode: Int, city: String)
case class Customer(id: Int, firstName: String, lastName: String, address: Address)
case class Order(lines: List[OrderLine], customer: Customer)

Transforming from one representation to the other can be achieved directly using .enableUnsafeOption:

val domainOrder = pbOrder.into[domain.Order].enableUnsafeOption.transform

and vice-versa:

val pbOrder = domainOrder.into[pb.Order].enableUnsafeOption.transform