Scala .par function with Spark 2.2.1 gives Janino compilation error


I have some code like the following:

def computeGroupByCount(columns: List[String], df: DataFrame): List[JsValue] = {

  val result: ListBuffer[JsValue] = new ListBuffer[JsValue]()
  val encoder: Encoder[ColumnGroupByCount] = Encoders.product[ColumnGroupByCount]

  columns.par.foreach { colName =>
    val groupByCount: Array[ColumnGroupByCount] = df
      .groupBy(colName)
      .count()
      .map(x => ColumnGroupByCount(colName, x.getString(0), x.getLong(1)))(encoder)
      .collect()
    result += Json.toJson(groupByCount)
  }
  result.toList
}

When I run this code it gives the error below, but it works in IntelliJ:

[info]   Cause: org.codehaus.janino.InternalCompilerException: Class 'org.apache.spark.sql.catalyst.expressions.codegen.GeneratedClass' was loaded through a different loader

and then dumps a long listing of the generated code.

Please help me with this.
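For what it's worth, this error usually means that Spark's Janino-generated classes and the calling code ended up on different classloaders, which can happen when `.par` runs the per-column work on the shared ForkJoinPool, whose worker threads may carry a different context classloader than the test runner's. Below is a minimal sketch of one workaround: replacing `.par` with Futures on a pool created by the calling thread, so its workers inherit the caller's context classloader. The `runInParallel` helper and the `work` function are hypothetical stand-ins for the Spark per-column job, not part of the original code:

```scala
import java.util.concurrent.Executors
import scala.concurrent.{Await, ExecutionContext, Future}
import scala.concurrent.duration.Duration

object FixedThreadPoolFix {
  // Runs `work` for each column on a thread pool whose workers are created
  // lazily by the submitting thread, so they inherit its context classloader
  // (unlike the shared ForkJoinPool that backs `.par`). Results keep the
  // order of the input list.
  def runInParallel(columns: List[String])(work: String => String): List[String] = {
    val pool = Executors.newFixedThreadPool(math.min(4, columns.size.max(1)))
    implicit val ec: ExecutionContext = ExecutionContext.fromExecutorService(pool)
    try Await.result(Future.traverse(columns)(c => Future(work(c))), Duration.Inf)
    finally pool.shutdown()
  }
}
```

Separately, since the `[info]` prefix suggests this fails under sbt: some reports indicate the mismatch goes away when sbt forks the test JVM (`Test / fork := true` in `build.sbt`), because the generated classes and the test then share one loader. Note also that appending to a `ListBuffer` from multiple threads, as in the original `.par.foreach`, is not thread-safe on its own.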
