I have a table A with a large number of rows and an id column. Additionally, I have a variable @Json that can have three types of values: null, an empty array '[]', or a JSON array like '[1,2,3]'. I am looking for the optimal way to filter the rows of table A based on the value of the @Json variable.
If @Json is null, I want to retrieve all rows from table A. If @Json is an empty array '[]', I don't want to retrieve any rows from table A. If @Json contains a JSON array, I want to retrieve only the rows where the id column value exists in that array.
Example:
+----+
| id |
+----+
|  1 |
|  2 |
|  3 |
|  4 |
|  5 |
+----+
Cases:
@Json NVARCHAR(MAX) = NULL; - Should fetch all rows
@Json NVARCHAR(MAX) = '[]'; - Should return empty
@Json NVARCHAR(MAX) = '[1,2,3]'; - Should return rows with Id In (1,2,3)
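The three cases hinge on how OPENJSON behaves for each input. A quick sketch of what each input produces (note: a NULL input to OPENJSON yields an empty result set rather than an error, which is why it must be handled separately in the query):

```sql
SELECT value FROM OPENJSON('[1,2,3]'); -- three rows: 1, 2, 3
SELECT value FROM OPENJSON('[]');      -- no rows
SELECT value FROM OPENJSON(NULL);      -- no rows (NULL input yields an empty set)
```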
I can't use temp tables or CTEs, because I need to filter the data in a subquery.
I would appreciate any help in finding the most efficient way to filter the rows of table A based on the JSON array variable.
I resolved my problem with:
DECLARE @Json NVARCHAR(MAX) = NULL;

SELECT *
FROM A a
LEFT JOIN OPENJSON(@Json) WITH (id INT '$') J ON a.id = J.id
WHERE @Json IS NULL OR J.id IS NOT NULL
I want to ensure that this is the optimal approach. Are there any other methods for filtering the rows of table A considering the specified requirements?
OR conditions wreak havoc on the query optimizer. In this case, I would suggest a UNION ALL of two easily optimized queries, where one handles the null case and the other handles the non-null case (zero or more IDs). The zero-IDs case just naturally works. Something like:
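A sketch of that shape, assuming the same table A and @Json variable as in the question (the explicit `@Json IS NOT NULL` guard on the second branch is optional, since OPENJSON(NULL) returns no rows anyway, but it makes the branches clearly mutually exclusive):

```sql
SELECT a.*
FROM A a
WHERE @Json IS NULL                              -- null case: return every row

UNION ALL

SELECT a.*
FROM A a
JOIN OPENJSON(@Json) WITH (id INT '$') J         -- non-null case: only matching IDs
    ON a.id = J.id
WHERE @Json IS NOT NULL;
```

When @Json is '[]', OPENJSON returns no rows, so the join produces nothing and the overall result is empty, exactly as required.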
If your select list is actually a complex list (not a.*), or if you have additional joins, you can wrap the above as a subquery and define your select list and joins at the outer level.
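For example, wrapping the UNION ALL as a derived table and joining at the outer level (table B and column b.name are hypothetical, just to illustrate the shape):

```sql
SELECT sub.id, b.name
FROM (
    SELECT a.*
    FROM A a
    WHERE @Json IS NULL
    UNION ALL
    SELECT a.*
    FROM A a
    JOIN OPENJSON(@Json) WITH (id INT '$') J ON a.id = J.id
    WHERE @Json IS NOT NULL
) sub
JOIN B b ON b.a_id = sub.id;   -- additional joins go here, outside the union
```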