Comparing Two Methods for Loading Data from SQL Server to Azure Cache for Redis


My goal is to transfer a million records from my SQL Server database to an Azure Cache for Redis instance. I cannot write them all in one go, so I have split the SET operations into smaller batches. I am looking for the best method to manage this transfer.

Option 1: I create a simple C# application that loads each record into Redis as shown in the code snippet.

using System;
using System.Collections.Generic;
using StackExchange.Redis;

class Program
{
    static void Main()
    {
        string cacheConnection = "your_connection_string_here";

        // Connect to Azure Cache for Redis
        ConnectionMultiplexer connection = ConnectionMultiplexer.Connect(cacheConnection);
        IDatabase cache = connection.GetDatabase();

        // Load the rows from SQL Server
        IEnumerable<(int Id, string Value)> dataFromSqlServer = GetRowsFromSqlServer();

        // Cache each row individually (one network round trip per record)
        foreach (var row in dataFromSqlServer)
        {
            cache.StringSet($"myKey:{row.Id}", row.Value);
        }

        Console.WriteLine("Data cached successfully!");
    }

    // Placeholder for the actual SQL Server query
    static IEnumerable<(int Id, string Value)> GetRowsFromSqlServer()
    {
        throw new NotImplementedException();
    }
}
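Since the records cannot be written in one go, the per-record loop above can be grouped so that each network round trip carries many keys instead of one. This is only a sketch of that idea, not my working code: the batch size of 10,000 is an arbitrary assumption, and it relies on the multi-key `StringSet` overload in StackExchange.Redis (a Redis `MSET`) plus `Enumerable.Chunk` from .NET 6+.

```csharp
using System.Collections.Generic;
using System.Linq;
using StackExchange.Redis;

class BatchedLoader
{
    // Writes rows in chunks using the multi-key StringSet overload (Redis MSET),
    // so each round trip carries up to batchSize keys instead of one.
    public static void LoadInBatches(IDatabase cache,
                                     IEnumerable<(int Id, string Value)> rows,
                                     int batchSize = 10_000) // assumed batch size
    {
        foreach (var chunk in rows.Chunk(batchSize)) // Enumerable.Chunk, .NET 6+
        {
            var pairs = chunk
                .Select(r => new KeyValuePair<RedisKey, RedisValue>($"myKey:{r.Id}", r.Value))
                .ToArray();

            cache.StringSet(pairs); // one MSET for the whole chunk
        }
    }
}
```

One caveat: on a clustered (Premium) cache, a multi-key `MSET` requires all keys in the batch to hash to the same slot, so pipelined `StringSetAsync` calls may be the safer batching strategy there.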

The problem is what happens if my application fails halfway: do I have to start over? I could filter on the creation date of the SQL Server records and save checkpoints, but I suspect there is a better way.
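The checkpoint idea could look roughly like the sketch below, assuming the rows are read in ascending `Id` order (e.g. `WHERE Id > @lastId ORDER BY Id`): the last successfully written `Id` is stored in Redis itself, and a restarted run resumes from it instead of starting over. The key name `checkpoint:myLoad` and the delegate signature are illustrative, not an established API.

```csharp
using System;
using System.Collections.Generic;
using StackExchange.Redis;

class CheckpointedLoader
{
    const string CheckpointKey = "checkpoint:myLoad"; // illustrative key name

    // Resumes from the last committed Id. Assumes rowsAfterId(n) yields
    // rows with Id > n in ascending Id order.
    public static void Load(IDatabase cache,
                            Func<int, IEnumerable<(int Id, string Value)>> rowsAfterId)
    {
        // Read the checkpoint; 0 means this is the first run
        RedisValue saved = cache.StringGet(CheckpointKey);
        int lastId = saved.HasValue ? (int)saved : 0;

        foreach (var row in rowsAfterId(lastId))
        {
            cache.StringSet($"myKey:{row.Id}", row.Value);
            cache.StringSet(CheckpointKey, row.Id); // advance the checkpoint
        }
    }
}
```

Writing the checkpoint after every row doubles the round trips; in practice it could be advanced only every N rows, at the cost of re-writing at most N records (harmless, since the SETs are idempotent) after a crash.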

Option 2: I use Azure Data Factory to load the million records and then call the Redis API for each record. Are there any disadvantages to using ADF?
