I am seeing the error below when joining two Hive tables using the PySpark HiveContext.
Error:

File "/usr/hdp/2.3.4.7-4/spark/python/lib/pyspark.zip/pyspark/sql/context.py", line 552, in sql
File "/usr/hdp/2.3.4.7-4/spark/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 538, in __call__
File "/usr/hdp/2.3.4.7-4/spark/python/lib/pyspark.zip/pyspark/sql/utils.py", line 36, in deco
File "/usr/hdp/2.3.4.7-4/spark/python/lib/py4j-0.8.2.1-src.zip/py4j/protocol.py", line 300, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o41.sql.
: org.apache.spark.SparkException: Job cancelled because SparkContext was shut down

Example:
lsf.registerTempTable('temp_table')
out = hc.sql(
    """INSERT OVERWRITE TABLE AAAAAA PARTITION (day = '2017-09-20')
    SELECT tt.*, ht.id
    FROM temp_table tt
    JOIN hive_table ht
    ON tt.id = ht.id
    """)
Also, how can I parameterize the day partition value?
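For the parameterization part, is plain Python string formatting the right approach? A minimal sketch of what I have in mind (HiveContext.sql takes a single query string, so as far as I know there is no bind-parameter mechanism here):

# Sketch: substitute the partition day into the query string before
# submitting it. 'day' could come from sys.argv, a config file, etc.
# (assumed; not part of the original snippet).
day = '2017-09-20'

query = """
    INSERT OVERWRITE TABLE AAAAAA PARTITION (day = '{day}')
    SELECT tt.*, ht.id
    FROM temp_table tt
    JOIN hive_table ht
    ON tt.id = ht.id
""".format(day=day)

out = hc.sql(query)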