ImportError: cannot import name 'HiveContext' from 'pyspark.sql'


I am running PySpark on my PC (Windows 10), but I cannot import HiveContext:

from pyspark.sql import HiveContext
---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
<ipython-input-25-e3ae767de910> in <module>
----> 1 from pyspark.sql import HiveContext

ImportError: cannot import name 'HiveContext' from 'pyspark.sql' (C:\spark\spark-3.0.0-preview-bin-hadoop2.7\python\pyspark\sql\__init__.py)

How should I proceed to resolve this?


There is 1 answer below

Oliver W. (Best answer)

You're using the preview release of Spark 3.0, where HiveContext no longer exists. According to the release notes, you should use SparkSession.builder.enableHiveSupport() instead:

"In Spark 3.0, the deprecated HiveContext class has been removed. Use SparkSession.builder.enableHiveSupport() instead."
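
For example, a minimal sketch of the replacement pattern (the application name and the query below are only illustrative):

from pyspark.sql import SparkSession

# Build a SparkSession with Hive support; this replaces the removed HiveContext.
spark = (
    SparkSession.builder
    .appName("hive-example")   # illustrative app name
    .enableHiveSupport()
    .getOrCreate()
)

# Queries that previously went through HiveContext.sql() now go through the session.
spark.sql("SHOW DATABASES").show()

Anything you used to do with a HiveContext (reading Hive tables, running HiveQL) is available on the SparkSession object itself once Hive support is enabled.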