Jobs are not shown in the Spark WebUI

I am a new user of Spark. I installed Spark, installed PySpark through Anaconda, and then ran the basic code below in a Jupyter notebook. When I open the Spark WebUI, however, I cannot see any jobs, either running or completed. Any comments are appreciated.

from pyspark.sql import SparkSession

spark = SparkSession.builder\
    .master("local")\
    .appName("NQlabtop")\
    .config('spark.ui.port', '4050')\
    .getOrCreate()
sc = spark.sparkContext

input_file = sc.textFile("C:/Users/nqazi/NQ/anscombe.json")
# avoid naming this variable "map", which shadows Python's built-in
words = input_file.flatMap(lambda line: line.split(" ")).map(lambda word: (word, 1))
counts = words.reduceByKey(lambda a, b: a + b)
print("counts", counts)  # prints the RDD object, not its contents

data = [1, 2, 3, 4, 5]
distData = sc.parallelize(data)

Please see the image of the Spark WebUI below. I am not sure why I cannot see any jobs; I think it should display the completed jobs.

[Screenshot: the Spark WebUI Jobs page, showing no running or completed jobs]

1 Answer

There are two types of operations in PySpark (Spark): transformations and actions. Transformations are lazily evaluated, and PySpark doesn't run any job until you call an action function such as show, count, or collect. Everything in your code (textFile, flatMap, map, reduceByKey, parallelize) is lazy; since no action is ever called, no job is submitted, and the WebUI has nothing to show.
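
For example, here is a minimal sketch (reusing the session setup and file path from your question) that triggers jobs by calling actions:

from pyspark.sql import SparkSession

spark = SparkSession.builder\
    .master("local")\
    .appName("NQlabtop")\
    .config('spark.ui.port', '4050')\
    .getOrCreate()
sc = spark.sparkContext

input_file = sc.textFile("C:/Users/nqazi/NQ/anscombe.json")
words = input_file.flatMap(lambda line: line.split(" ")).map(lambda word: (word, 1))
counts = words.reduceByKey(lambda a, b: a + b)

# collect() is an action: it forces Spark to execute the transformations
# above, and the resulting job shows up in the WebUI.
print("counts", counts.collect())

# count() is another action; each action submits its own job.
print("distinct words:", counts.count())

After running an action, refresh the WebUI at http://localhost:4050 and the completed jobs should appear on the Jobs tab.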