I'm trying to enforce that classes extending W implement a get method that returns a Dataset of a subclass of WR.
abstract class WR

case class TGWR(
  a: String,
  b: String
) extends WR

abstract class W {
  def get[T <: WR](): Dataset[T]
}
class TGW(sparkSession: SparkSession) extends W {
  override def get[TGWR](): Dataset[TGWR] = {
    import sparkSession.implicits._
    Seq(TGWR("dd", "dd")).toDF().as[TGWR]
  }
}
Compilation error:
Unable to find encoder for type stored in a Dataset. Primitive types (Int, String, etc) and Product types (case classes) are supported by importing spark.implicits._ Support for serializing other types will be added in future releases.
If I change the get function to the following:
def get(): Dataset[TGWR]
and
override def get(): Dataset[TGWR] = {...
it compiles, so I suspect a problem with the inheritance/type hierarchy.
Forget my comment, I re-read your question and noticed a simple problem.
Here

override def get[TGWR]

you are not saying that this class produces instances of TGWR; you are declaring a new type parameter named TGWR, which shadows your real type. I fixed it with the following code:
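The answer's code block did not survive here, so this is a minimal sketch of the described fix: move the type parameter from the method onto the class, keeping the `T <: WR` bound and the names from the question, so each subclass fixes the element type once instead of shadowing it on get().

```scala
import org.apache.spark.sql.{Dataset, SparkSession}

abstract class WR
case class TGWR(a: String, b: String) extends WR

// The type parameter lives on the class: a subclass commits to one
// concrete T, and get() no longer re-introduces (and shadows) it.
abstract class W[T <: WR] {
  def get(): Dataset[T]
}

class TGW(sparkSession: SparkSession) extends W[TGWR] {
  import sparkSession.implicits._

  // TGWR is a concrete case class here, so the implicit Encoder
  // from sparkSession.implicits._ is found and this compiles.
  override def get(): Dataset[TGWR] =
    Seq(TGWR("dd", "dd")).toDF().as[TGWR]
}
```

This works because the encoder lookup now happens against the concrete TGWR, not against an unresolved method-level type parameter for which no Encoder exists.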
You can use it like this:
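The usage snippet was also lost in extraction; it would look roughly like this (the local[*] session setup and app name are my assumptions, not from the original answer):

```scala
import org.apache.spark.sql.{Dataset, SparkSession}

object Example {
  def main(args: Array[String]): Unit = {
    // Assumed local session just for demonstration.
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("tgw-example")
      .getOrCreate()

    // The static type is Dataset[TGWR], as the abstract class enforces.
    val ds: Dataset[TGWR] = new TGW(spark).get()
    ds.show()

    spark.stop()
  }
}
```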
I hope this is what you are looking for.
Don't hesitate to ask for clarification.