Recommend several items with SparkR ALS

I'm following the SparkR example for ALS:

library(SparkR)
sparkR.session()

# Load training data
data <- list(list(0, 0, 4.0), list(0, 1, 2.0), list(1, 1, 3.0),
             list(1, 2, 4.0), list(2, 1, 1.0), list(2, 2, 5.0))
df <- createDataFrame(data, c("userId", "movieId", "rating"))
training <- df
test <- df

# Fit a recommendation model using ALS with spark.als
model <- spark.als(training, maxIter = 5, regParam = 0.01, userCol = "userId",
                   itemCol = "movieId", ratingCol = "rating")

# Model summary
summary(model)

# Prediction
predictions <- predict(model, test)
head(predictions)

This works fine, but I'm running into the following issue:

How do I specify the number of items to recommend?

In the Python example it is quite clear:

movieSubSetRecs = model.recommendForItemSubset(movies, 10)

But I can't find an equivalent for SparkR.

Also, I cannot switch to sparklyr; it has to be done with SparkR.

There is 1 answer below.

The SparkR developers did not expose this method: the ALSWrapper they use hides most of the underlying ALSModel methods. Here is a workaround that calls the method directly on the wrapped JVM object.

# get the underlying ALSModel from the wrapper and call its recommendForAllUsers method
jdf <- sparkR.callJMethod(
  sparkR.callJMethod(model@jobj, "alsModel"),
  "recommendForAllUsers",
  as.integer(10)
)
# wrap the returned Java DataFrame in an R SparkDataFrame (is there a better way?)
df <- new("SparkDataFrame", jdf, FALSE)
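The same JVM-call trick should extend to the subset variants from the Python API. Below is a minimal sketch, assuming Spark 2.3+ (where ALSModel gained recommendForItemSubset), that the underlying Java object of a SparkR SparkDataFrame lives in the @sdf slot, and a hypothetical movies SparkDataFrame holding the item ids:

# hypothetical sketch: recommendForItemSubset through the same JVM hack
# 'movies' is an assumed SparkDataFrame with a movieId column
movies <- createDataFrame(data.frame(movieId = c(0, 1, 2)))
jdfItems <- sparkR.callJMethod(
  sparkR.callJMethod(model@jobj, "alsModel"),
  "recommendForItemSubset",
  movies@sdf,      # the underlying Java Dataset of the R SparkDataFrame
  as.integer(10)   # number of users to recommend for each item
)
itemSubsetRecs <- new("SparkDataFrame", jdfItems, FALSE)
head(itemSubsetRecs)

As with recommendForAllUsers, the result should have one row per item with a recommendations column containing an array of (userId, rating) structs.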