map transformation on Spark paired RDD


I applied a map transformation to the paired RDD below:

sc.parallelize(List((1,10),(2,20),(3,30),(4,40))); 

using two slightly different syntaxes.

case 1:

res0.map({case (x,y)=>(x,y+1)}).collect;

which gives the following result:

Array[(Int, Int)] = Array((1,11), (2,21), (3,31), (4,41))

case 2:

res0.map(case (x,y)=>(x,y+1)).collect;

which gives the following error:

error: illegal start of simple expression

May I know the reason for the failure in case 2? The only difference between the two cases is the curly braces.

Thanks in advance.

1 Answer

A case clause requires {}. In Scala, { case arg => logic } is a pattern-matching anonymous function (commonly described as a partial function literal), so the braces are part of the syntax; a bare case cannot appear inside plain parentheses, which is why the compiler reports "illegal start of simple expression".
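As a quick sketch of the contrast (assuming the same sc.parallelize call from the question, bound here to a val named pairs):

val pairs = sc.parallelize(List((1,10),(2,20),(3,30),(4,40)))

// Compiles: the braces make this a pattern-matching function literal
pairs.map { case (x, y) => (x, y + 1) }.collect
// Array((1,11), (2,21), (3,31), (4,41))

// Does not compile: a bare case clause cannot appear inside plain parentheses
// pairs.map(case (x, y) => (x, y + 1))   // error: illegal start of simple expression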

You can also write it without pattern matching, BTW:

val y = res0.map(a => (a._1, a._2 + 1)).collect
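And since only the value changes here, mapValues on the paired RDD is another option (a small sketch, reusing res0 from the question):

// mapValues keeps the key untouched and applies the function to the value only
res0.mapValues(_ + 1).collect
// Array((1,11), (2,21), (3,31), (4,41))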