Spark: Convert String to spark.sql.types object in scala


I have an Array of Strings, Array("StringType", "IntegerType", "LongType", "StringType"), in Scala, and I need to convert each String to the corresponding spark.sql.types object while iterating.

for example:

"StringType" = spark.sql.types.StringType
"IntegerType" = spark.sql.types.IntegerType
"LongType" = spark.sql.types.LongType

One solution is to create a one-to-one HashMap from Strings to spark.sql.types objects and use it while iterating over the array. Is there a cleaner way to do this?
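The map-based approach described above can be sketched as follows (the map and value names are illustrative, and only the three type names from the question are covered):

```scala
import org.apache.spark.sql.types._

object MapLookupExample {
  // One-to-one lookup table from type names to DataType objects;
  // extend with whatever additional type names your input may contain
  val typeMap: Map[String, DataType] = Map(
    "StringType"  -> StringType,
    "IntegerType" -> IntegerType,
    "LongType"    -> LongType
  )

  def main(args: Array[String]): Unit = {
    val input = Array("StringType", "IntegerType", "LongType", "StringType")
    // Map#apply throws NoSuchElementException for missing keys;
    // use typeMap.get(name) to handle unknown names as an Option instead
    val converted: Array[DataType] = input.map(typeMap)
    println(converted.mkString(", "))
  }
}
```

One caveat of this approach is that unknown type names only fail at lookup time, so the error message from a raw `Map#apply` is less descriptive than an explicit match with a custom exception.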

1 Answer
Answered by Glennie Helles Sindholt:

I would probably just use Scala pattern matching, like:

import org.apache.spark.sql.types._

val typeList = inputArray.map {
  case "StringType"  => StringType
  case "IntegerType" => IntegerType
  case "LongType"    => LongType
  // add a case for each type name you expect
  case other => throw new RuntimeException(s"unknown type: $other")
}
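In practice, an array of type names like this is often used to build a schema. A sketch of that usage, wrapping the pattern match in a helper function (the function name, column names, and exception choice are all illustrative, not from the answer):

```scala
import org.apache.spark.sql.types._

object SchemaExample {
  // Hypothetical helper wrapping the pattern match from the answer
  def toDataType(name: String): DataType = name match {
    case "StringType"  => StringType
    case "IntegerType" => IntegerType
    case "LongType"    => LongType
    case other => throw new IllegalArgumentException(s"unknown type: $other")
  }

  def main(args: Array[String]): Unit = {
    val typeNames = Array("StringType", "IntegerType", "LongType", "StringType")
    val colNames  = Array("name", "age", "id", "city") // hypothetical column names
    // Zip names with resolved types to produce a StructType schema
    val schema = StructType(colNames.zip(typeNames).map {
      case (col, tpe) => StructField(col, toDataType(tpe))
    })
    println(schema.treeString)
  }
}
```

`StructType` is the standard way Spark represents a DataFrame schema, so converting the strings directly into `StructField`s avoids an intermediate collection of bare `DataType`s.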