Out of memory when using .asJson on a big object


I use Scala with circe. I have an object that contains a very big list, and encoding it to JSON leads to an out-of-memory error. Is there a way to do this incrementally, e.g. with iterators or Akka reactive streams?


Answer by Victor Nguen:

You can stream JSON encoding/decoding, or any other computation over a large collection, with a streaming library such as Akka Streams, depending on the technology stack used in your project.

Here is an example of how to encode data to JSON with circe (version 0.14.5) and Akka Streams (version 2.8.0):

import akka.actor.ActorSystem
import akka.stream.scaladsl._
import io.circe.Json
import io.circe.generic.auto._
import io.circe.syntax._

object CirceParseBigCollection extends scala.App {
  case class MyData(field1: String, field2: Int)

  val bigList = LazyList.from(1).take(100000).map { i =>
    MyData(s"item-$i", i)
  }

  implicit val system: ActorSystem                           = ActorSystem()
  implicit val ec: scala.concurrent.ExecutionContextExecutor = system.dispatcher

  // Create a Source that emits the elements of the big list one at a time
  val source = Source(bigList)

  val flow = Flow[MyData].map { obj =>
    val json = obj.asJson
    Thread.sleep(100) // any additional computations here
    json
  }

  val sink = Sink.foreach[Json](json => println(json))

  // Run the stream and shut down the ActorSystem once it completes
  source.via(flow).runWith(sink).onComplete(_ => system.terminate())

}
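
If the goal is to persist the result rather than print it, the same approach can write the encoded values straight to disk so the full output never has to fit in memory. Below is a minimal sketch under the same circe/Akka Streams versions that writes newline-delimited JSON with FileIO; the file name big-list.jsonl and the reuse of the MyData case class are only illustrative assumptions.

import java.nio.file.Paths

import akka.actor.ActorSystem
import akka.stream.scaladsl.{FileIO, Source}
import akka.util.ByteString
import io.circe.generic.auto._
import io.circe.syntax._

object CirceStreamToFile extends App {
  case class MyData(field1: String, field2: Int)

  implicit val system: ActorSystem                           = ActorSystem()
  implicit val ec: scala.concurrent.ExecutionContextExecutor = system.dispatcher

  val bigList = LazyList.from(1).take(100000).map(i => MyData(s"item-$i", i))

  Source(bigList)
    .map(obj => ByteString(obj.asJson.noSpaces + "\n")) // one JSON object per line
    .runWith(FileIO.toPath(Paths.get("big-list.jsonl")))
    .onComplete { result =>
      println(s"Done: $result")
      system.terminate()
    }
}

Writing one JSON value per line keeps memory usage bounded by a single element. If you need a single valid JSON array instead, you can prepend "[", intersperse "," between elements (e.g. with Source.intersperse), and append "]" while still streaming.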