We currently have a data warehouse, on SQL Server 2019, that holds data from multiple tenants. All tenant databases share the same schema, and the data from every tenant is consolidated into the warehouse, where it is partitioned by tenant. We have a parameter sniffing problem with the new dashboard because data volumes vary a lot between tenants: some tenants have fewer than 10,000 rows, while a couple have up to 5 million. As a result, dashboard performance is poor for large tenants whenever the execution plan was built for a small tenant's parameter value.
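To illustrate, the dashboard queries follow roughly this shape (the procedure, table, and column names below are simplified placeholders, not our real schema):

```sql
-- Simplified illustration: one cached plan is shared by all tenants,
-- so whichever @TenantId happens to trigger compilation dictates the
-- plan shape for everyone.
CREATE OR ALTER PROCEDURE dbo.GetDashboardData
    @TenantId INT
AS
BEGIN
    SELECT f.OrderDate, SUM(f.Amount) AS TotalAmount
    FROM dbo.FactSales AS f        -- partitioned on TenantId
    WHERE f.TenantId = @TenantId
    GROUP BY f.OrderDate;
END;
```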
Suggestions available on the internet say to use the RECOMPILE hint, the OPTIMIZE FOR hint, etc. But I have a question about the basics of this parameter sniffing. As statistics are maintained by SQL Server at the partition level, is this statistics information not used to check whether the cached plan is right for a new runtime value? Before executing, are the statistics for the compile-time and runtime values ever compared to see whether the cached plan is still valid?
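On the illustrative procedure above, those suggestions would mean the SELECT inside dbo.GetDashboardData becomes one of:

```sql
-- Suggestion 1: recompile on every execution, so the statistics for
-- the actual @TenantId are considered each time (at the cost of
-- compile CPU on every run).
SELECT f.OrderDate, SUM(f.Amount) AS TotalAmount
FROM dbo.FactSales AS f
WHERE f.TenantId = @TenantId
GROUP BY f.OrderDate
OPTION (RECOMPILE);

-- Suggestion 2: ignore the sniffed value and optimize for an average
-- tenant, derived from statistical density rather than the histogram.
SELECT f.OrderDate, SUM(f.Amount) AS TotalAmount
FROM dbo.FactSales AS f
WHERE f.TenantId = @TenantId
GROUP BY f.OrderDate
OPTION (OPTIMIZE FOR (@TenantId UNKNOWN));
```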
Kindly advise.
Parameters are for when you want shared, reused query plans. Hard-coding the criteria that cause query plans to vary is the basic right answer here.
And even though you say "as much as possible, we are refraining from using Dynamic SQL in the code", you should make an exception here.
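Here is a minimal sketch of what that could look like, reusing the illustrative procedure shape from your question (since @TenantId is an INT, embedding it as a literal is injection-safe; for string criteria you would need to validate or QUOTENAME the value):

```sql
CREATE OR ALTER PROCEDURE dbo.GetDashboardData
    @TenantId INT
AS
BEGIN
    DECLARE @sql NVARCHAR(MAX);

    -- Embed the tenant id as a literal instead of a parameter, so each
    -- distinct tenant id gets its own cached plan, built from that
    -- tenant's own statistics.
    SET @sql = N'
        SELECT f.OrderDate, SUM(f.Amount) AS TotalAmount
        FROM dbo.FactSales AS f
        WHERE f.TenantId = ' + CAST(@TenantId AS NVARCHAR(12)) + N'
        GROUP BY f.OrderDate;';

    EXEC sys.sp_executesql @sql;
END;
```

Each distinct tenant id yields its own plan cache entry, so with a bounded number of tenants the extra plans are cheap compared with reusing one wrong plan for everyone.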
If the time spent in query optimization on each execution is acceptable, OPTION (RECOMPILE) is almost as good.