I'm trying to import types from Spark SQL as follows:
import org.apache.spark.sql.types._
But I get errors such as "not found: value DataType" and "not found: type ByteType".
The full code is:

import java.math.BigDecimal
import java.sql.{Timestamp, Date}
import org.apache.spark.sql.types._

/**
 * Utility functions for type casting
 */
object TypeCast {

  /**
   * Casts given string datum to specified type.
   * Currently we do not support complex types (ArrayType, MapType, StructType).
   *
   * @param datum string value
   * @param castType SparkSQL type
   */
  def castTo(datum: String, castType: DataType): Any = {
    castType match {
      case _: ByteType      => datum.toByte
      case _: ShortType     => datum.toShort
      case _: IntegerType   => datum.toInt
      case _: LongType      => datum.toLong
      case _: FloatType     => datum.toFloat
      case _: DoubleType    => datum.toDouble
      case _: BooleanType   => datum.toBoolean
      case _: DecimalType   => new BigDecimal(datum.replaceAll(",", ""))
      case _: TimestampType => Timestamp.valueOf(datum)
      case _: DateType      => Date.valueOf(datum)
      case _: StringType    => datum
      case _ => throw new RuntimeException(s"Unsupported type: ${castType.typeName}")
    }
  }
}
ByteType etc. are not types but singleton case objects, so a type pattern such as _: ByteType does not compile; that is where "not found: type ByteType" comes from. You need to match on the objects themselves, so you probably want something like the following (except that, at least in my version of Spark, there is no DecimalType, and castType does not have a typeName field):
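Here is a minimal sketch matching on the singletons. The import from org.apache.spark.sql.catalyst.types is an assumption: before Spark 1.3 these types lived there rather than in org.apache.spark.sql.types, which would also explain your failing import.

import java.sql.{Timestamp, Date}
// Assumed pre-1.3 location of the SQL types; from 1.3 on they live in
// org.apache.spark.sql.types.
import org.apache.spark.sql.catalyst.types._

object TypeCast {
  def castTo(datum: String, castType: DataType): Any = {
    castType match {
      // Match the singleton case objects directly instead of using
      // type patterns such as _: ByteType.
      case ByteType      => datum.toByte
      case ShortType     => datum.toShort
      case IntegerType   => datum.toInt
      case LongType      => datum.toLong
      case FloatType     => datum.toFloat
      case DoubleType    => datum.toDouble
      case BooleanType   => datum.toBoolean
      // DecimalType is omitted because my Spark version does not have it.
      case TimestampType => Timestamp.valueOf(datum)
      case DateType      => Date.valueOf(datum)
      case StringType    => datum
      // DataType has no typeName field here, so fall back to toString.
      case _ => throw new RuntimeException(s"Unsupported type: $castType")
    }
  }
}

With this, TypeCast.castTo("42", IntegerType) returns the Int 42 (as an Any), and anything unmatched falls through to the RuntimeException.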