Is there a way to call a Scala lambda function from PySpark?


I have a lambda function defined in Scala.

package com.stack.flow.info

import org.apache.spark.sql.Column
import org.apache.spark.sql.functions.col

object ObjForExp {

  // Lambda that wraps a column name into a Spark Column
  val getColumn: String => Column = (columnName) => {
    col(columnName)
  }

  def cool(t: String): Unit = {
    print(t)
  }

}

I want to call it from PySpark.

I know I can pass the jar like this:

pyspark --jars sample.jar

Also, I know I can call the method cool like this:

sc._jvm.com.stack.flow.info.ObjForExp.cool("What Ever")

But is there a way to call a Scala lambda function like getColumn from PySpark?

What I know is that Py4J talks to the actual Spark Java/JVM classes, so there is a chance it has no idea how to handle a Scala function value like String => Column.
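
For context, this is roughly what I would expect the call to look like if it were possible. I am guessing that the val compiles to a getter getColumn() on the object that returns a scala.Function1, and that PySpark's Column wrapper can take the Java Column returned by apply(); I have not been able to confirm any of this (the column name is just an example):

from pyspark.sql import SparkSession
from pyspark.sql.column import Column

spark = SparkSession.builder.getOrCreate()
sc = spark.sparkContext

# Guess: the Scala val is exposed on the object as a getter returning a scala.Function1
scala_fn = sc._jvm.com.stack.flow.info.ObjForExp.getColumn()

# Guess: apply() on the Function1 yields a Java org.apache.spark.sql.Column,
# which PySpark's Column class can wrap directly
jcol = scala_fn.apply("some_column")
py_col = Column(jcol)

df = spark.createDataFrame([("a",)], ["some_column"])
df.select(py_col).show()

Would something along these lines work, or does Py4J simply not know how to deal with the Scala function object?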
