I have an application using Spark MLlib with Scala, and I want to split my data into 3 parts: training, test, and validation. My code is the following:
val training_RDD = Ratingfiles.filter(x => x._1 < 6)
.values
.cache()
val validation_RDD = Ratingfiles.filter(x => x._1 >= 6 && x._1 < 8)
.values
.cache()
When I compile my program with sbt compile, I get this error:
value _1 is not a member of org.apache.spark.mllib.recommendation.Rating
Spark-core: 1.4.1, Spark-MLlib: 2.0.1, Scala version: 2.11.1, sbt version: 0.13.12
As the compiler says, org.apache.spark.mllib.recommendation.Rating does not have a member called _1 (you're probably confusing it with a Tuple, whose members are _1, _2, etc.). Rating has three members: user, product, and rating. So, if you mean to filter by user, simply access that member instead of _1:
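For example, here is a minimal sketch assuming Ratingfiles is an RDD[Rating] and that you intend to split on the user id with your 6/8 thresholds (the test split at user >= 8 is just an illustrative choice, adjust it to your needs):

import org.apache.spark.mllib.recommendation.Rating
import org.apache.spark.rdd.RDD

// Rating is a case class: Rating(user: Int, product: Int, rating: Double)
// Ratingfiles is assumed to be your existing RDD[Rating]

val training_RDD = Ratingfiles.filter(x => x.user < 6)
    .cache()

val validation_RDD = Ratingfiles.filter(x => x.user >= 6 && x.user < 8)
    .cache()

val test_RDD = Ratingfiles.filter(x => x.user >= 8)
    .cache()

Note that .values is dropped as well: it comes from PairRDDFunctions and only exists on RDDs of key/value tuples, not on an RDD[Rating].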