Spark 2.2: extracting date from unix_timestamp not working


In Spark 2.2, extracting the date from unix_timestamp is not working. Input data:

+-------------------------+
|UPDATE_TS                |
+-------------------------+
|26NOV2009:03:27:01.154410|
|24DEC2012:00:47:46.805710|
|02MAY2013:00:45:33.233844|
|21NOV2014:00:33:39.350140|
|10DEC2013:00:30:30.532446|
+-------------------------+

I tried the following approaches, but the output I'm getting is null.

Queries tried:

Spark SQL:

sqlContext.sql("select from_unixtime(unix_timestamp(UPDATE_TS,'ddMMMyyyy:HH:MM:SS.ssssss'), 'yyyy') as new_date from df_vendor_tab").show()

DSL:

df_id.withColumn('part_date', from_unixtime(unix_timestamp(df_id.UPDATE_TS, "ddMMMyyyy:HH:MM:SS.sss"), "yyyy"))

Expected output:

2009
2012
2013
2014
2013

There are 2 answers below.

Answer 1 (accepted):

You're using an incorrect format string: capital M means month, while lowercase m means minute. Likewise, lowercase s means second and capital S means the fractional part, so the seconds field should be ss rather than SS. (Spark 2.2 parses dates with Java's SimpleDateFormat, which only supports milliseconds for S, so the six-digit fraction is read as a count of milliseconds; that slightly skews the time of day, but it does not affect the extracted year.)

The following would work:

from pyspark.sql.functions import from_unixtime, unix_timestamp

df_id.withColumn(
    'part_date',
    # ddMMMyyyy:HH:mm:ss.SSSSSS matches values like 26NOV2009:03:27:01.154410
    from_unixtime(unix_timestamp(df_id.UPDATE_TS, "ddMMMyyyy:HH:mm:ss.SSSSSS"), "yyyy")
).show(truncate=False)
#+-------------------------+---------+
#|UPDATE_TS                |part_date|
#+-------------------------+---------+
#|26NOV2009:03:27:01.154410|2009     |
#|24DEC2012:00:47:46.805710|2012     |
#|02MAY2013:00:45:33.233844|2013     |
#|21NOV2014:00:33:39.350140|2014     |
#|10DEC2013:00:30:30.532446|2013     |
#+-------------------------+---------+
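
For completeness, the same corrected pattern also fixes the Spark SQL query from the question (a sketch, assuming the DataFrame is still registered as the df_vendor_tab temporary table used in the original query):

sqlContext.sql(
    "select from_unixtime(unix_timestamp(UPDATE_TS, 'ddMMMyyyy:HH:mm:ss.SSSSSS'), 'yyyy') "
    "as new_date from df_vendor_tab"
).show()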
Answer 2:

Plain Spark SQL works fine with unix_timestamp and from_unixtime. SimpleDateFormat stops once the pattern is exhausted, so a date-only pattern parses the leading ddMMMyyyy and ignores the trailing time:

sqlContext.sql("Select from_unixtime(unix_timestamp('26NOV2009:03:27:01.154410', 'ddMMMyyyy'), 'yyyy')").show

Output:

+----+
| _c0|
+----+
|2009|
+----+

Since you are only extracting the year, I have not included the hour, minutes, or seconds in the pattern.
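
Applied to the original DataFrame, that shorter pattern would look like this (a sketch; SimpleDateFormat simply stops after the date portion and ignores the rest of the string):

from pyspark.sql.functions import from_unixtime, unix_timestamp

# Parse only the leading ddMMMyyyy date; the trailing time is ignored.
df_id.withColumn(
    'part_date',
    from_unixtime(unix_timestamp(df_id.UPDATE_TS, 'ddMMMyyyy'), 'yyyy')
).show(truncate=False)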