Is there a way to use SQL Server's CROSS APPLY in Spark SQL?


I have a complex stored procedure that uses multiple views/functions, and inside those there are multiple CROSS APPLYs. I am not sure if there is an "easy solution" to replicate it in spark.sql.

df = spark.sql(f""" select * from table CROSS APPLY ( some business rule ) """)

I checked the documentation: https://docs.databricks.com/en/sql/language-manual/sql-ref-syntax-qry-select-join.html but didn't find what I want.

1 Answer

Alejandro23 (best answer):

First of all, you must enable this option in your notebook:

spark.sql("SET spark.sql.crossJoin.enabled=true")
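With that flag set, a plain `CROSS JOIN` covers the uncorrelated cases of `CROSS APPLY`. For the correlated cases (the typical reason to reach for `CROSS APPLY` in T-SQL), Spark 3.2+ also supports `LATERAL` subqueries, which are the closer analogue. A minimal sketch, assuming Spark 3.2+; the table and column names (`orders`, `order_items`, `amount`) are hypothetical placeholders for your business rule:

```sql
-- Uncorrelated case: pair every row of t1 with every row of t2
SELECT *
FROM table1 t1
CROSS JOIN table2 t2;

-- Correlated case (Spark 3.2+): a LATERAL subquery can reference
-- columns of the preceding table, much like CROSS APPLY
SELECT o.id, r.total
FROM orders o,
LATERAL (
  SELECT SUM(i.amount) AS total
  FROM order_items i
  WHERE i.order_id = o.id
) r;
```

Note that on Spark 3.x the `spark.sql.crossJoin.enabled` flag defaults to `true`, so the explicit `CROSS JOIN` syntax usually works without setting it; the `SET` statement above mainly matters on Spark 2.x clusters.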