I need to create an hourly .SQB backup file of some specific tables, each filtered with a WHERE clause, from a SQL Server database. As an example, I need this data:
SELECT * FROM table1 WHERE pk_id IN (2,5,7)
SELECT * FROM table2 WHERE pk_id IN (2,5,7)
SELECT * FROM table3 WHERE pk_id IN (2,5,7)
SELECT * FROM table4 WHERE pk_id IN (2,5,7)
The structure of the tables in the source database may change over time: columns may be added or removed, indexes added, and so on.
One option is to do some kind of export, script generation, etc. into a staging database on the same instance of SQL Server. Efficiency aside, I have no problem dropping or truncating the tables in the destination database each time; in short, having both the schema and the data of those tables duplicated to the destination database is completely acceptable.
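To illustrate, this is roughly what I have in mind for that option (StagingDB is just a placeholder name; SELECT ... INTO re-creates the table with whatever columns the source currently has, though it does not copy indexes):

-- Hypothetical staging copy for one table; repeat for table2 through table4.
IF OBJECT_ID('StagingDB.dbo.table1', 'U') IS NOT NULL
    DROP TABLE StagingDB.dbo.table1;

SELECT *
INTO StagingDB.dbo.table1
FROM dbo.table1
WHERE pk_id IN (2,5,7);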
Another is to just create a .SQB backup from the source database. Since the .SQB file is all that I really need (it's going to be sent via SFTP), that would be fine, too.
What's the recommended approach in this scenario?
Well, if I understand your requirement correctly, you want data from certain tables in your database to be shipped somewhere else periodically.
One thing that is not possible in SQL Server is taking a backup of just a subset of tables in a database, so a native filtered backup is not an option.
Since you have mentioned you will be using SFTP to send the data, using the BCP command to extract the data is one option, although BCP may not perform particularly well and it certainly will not scale out.
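As a rough sketch (server, database and file names below are placeholders), each table could be exported with a queryout call and the resulting files then zipped and pushed over SFTP:

bcp "SELECT * FROM MyDb.dbo.table1 WHERE pk_id IN (2,5,7)" queryout table1.dat -S MyServer -T -n

Bear in mind that BCP exports data only, not table definitions, so the schema changes you mentioned would have to be handled separately.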
There are also a few options built around a replication subscriber database: publish just these tables with a row filter so the subscriber only receives the rows you need, and then take your backup from the subscriber database.
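For example, with transactional replication the filter can go on the article definition itself (the publication name below is made up, and the surrounding publication/subscription setup is omitted):

-- Hypothetical filtered article; the subscriber only receives matching rows.
EXEC sp_addarticle
    @publication   = N'FilteredTables',
    @article       = N'table1',
    @source_owner  = N'dbo',
    @source_object = N'table1',
    @filter_clause = N'pk_id IN (2,5,7)';

By default, transactional replication also pushes most schema changes (such as added or dropped columns) through to the subscriber, which covers your point about the source tables changing over time.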
In short, there is more than one way to skin this cat; now you have to decide which one suits your requirements best.