I'm new to sparklyr (but familiar with Spark and PySpark), and I've got a really basic question. I'm trying to filter a column based on a partial match. In dplyr, I'd write my operation like so:
businesses %>%
filter(grepl('test', biz_name)) %>%
head
Running that code on a spark dataframe however gives me:
Error: org.apache.spark.sql.AnalysisException: Undefined function: 'GREPL'. This function is neither a registered temporary function nor a permanent function registered in the database 'project_eftpos_failure'.; line 5 pos 7
The same as in standard Spark, you can use either `rlike` (Java regular expressions) or `like` (simple SQL `LIKE` patterns) in place of `grepl`; sparklyr translates both into the corresponding Spark SQL expressions. Both methods can also be used in infix form, as `%rlike%` and `%like%` respectively.
For details see `vignette("sql-translation")`.
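A minimal sketch of the two approaches, using the `businesses` table from the question. This assumes a live sparklyr connection; note that `rlike` and `like` are not ordinary R functions — they only work inside a pipeline on a Spark DataFrame, where sparklyr's SQL translation maps them to Spark SQL:

```r
library(sparklyr)
library(dplyr)

# Hypothetical local connection and sample data, standing in for the
# question's `businesses` Spark table.
sc <- spark_connect(master = "local")
businesses <- copy_to(sc, data.frame(biz_name = c("test cafe", "other bar")),
                      "businesses")

# Java regular expressions:
businesses %>% filter(rlike(biz_name, "test")) %>% head()

# Simple SQL LIKE patterns (`%` matches any sequence of characters):
businesses %>% filter(like(biz_name, "%test%")) %>% head()

# The same two filters with the infix operators:
businesses %>% filter(biz_name %rlike% "test") %>% head()
businesses %>% filter(biz_name %like% "%test%") %>% head()

spark_disconnect(sc)
```

Because these calls are translated rather than executed in R, running them on a plain local data frame will fail — they are only valid against a Spark-backed `tbl`.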