Not long ago, when I typed pyspark in my terminal, it would end up looking like this:

some information
>>>

But now it launches a Jupyter notebook automatically.
This happens with spark-3.0.0-preview2-bin-hadoop3.2; I have used many versions of Spark before and never saw it.
Is this behavior caused by a mistake in my configuration, or by a change in this Spark release?
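For reference, I know the pyspark launcher honors the PYSPARK_DRIVER_PYTHON environment variables (per the Spark documentation), so I suspect something like the following in a shell profile or spark-env.sh could be responsible, but I have not confirmed this is what changed on my machine:

```shell
# Lines like these (e.g. in ~/.bashrc or conf/spark-env.sh) tell the
# pyspark launch script to start Jupyter instead of the plain Python REPL.
export PYSPARK_DRIVER_PYTHON=jupyter
export PYSPARK_DRIVER_PYTHON_OPTS=notebook
```

Removing or unsetting these variables should, as far as I understand, restore the plain >>> prompt.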
Thanks for your help.