Method that fetches data from multiple databases is running slowly


I have a method that needs to fetch statistics from multiple databases. The key idea is that each table holds a DBName; from it I drill down to the client's main DB by calling a stored proc with the desired database name. Finally, I drill down a second time to get data from the client's project database.

To sum it up:

  1. I get the list of all my cloud users.

  2. For each user, I fetch his clients by calling a stored proc on his main DB -> marked as userClients.

  3. For each client, I fetch his statistics by calling a stored proc on the client's project DB.

It takes about 5-6 seconds to execute for very little data.

public List<CloudAnalysisDTO> GetCloudAnalysisForPeriod(DateTime FromDate, DateTime ToDate)
{
    var users = FindAll();
    var resultsList = new List<CloudAnalysisDTO>();
    var seenClients = new HashSet<string>();

    using (var db = new ProjSQLDataContext(conn))
    {
        foreach (var user in users)
        {
            if (user.ID == 0)
                continue;

            // One round trip per user: collect the client IDs as a CSV string.
            var ids = string.Join(",", db.UserClients
                .Where(uc => uc.UserId == user.ID)
                .Select(uc => uc.ClientId.ToString()));
            var mainDB = user.MainDB;

            if (mainDB.Length == 0 || ids.Length == 0)
                continue;

            // Second round trip per user: stored proc on the user's main DB.
            List<CloudAnalysisDTO> userClients = db.ExecuteQuery<CloudAnalysisDTO>(
                @"EXEC CloudUsersAnalysis {0},{1}", mainDB, ids).ToList();

            var needRemove = new List<CloudAnalysisDTO>();

            foreach (var client in userClients)
            {
                // HashSet<T>.Add returns false when the key is already present,
                // so a separate Contains check is unnecessary.
                if (!seenClients.Add(mainDB + client.ClientID))
                {
                    needRemove.Add(client);
                    continue;
                }

                // Third round trip, once per client: stored proc on the project DB.
                var clientAnalysisDTO = db.ExecuteQuery<ClientAnalysisDTO>(
                    @"EXEC CloudClientAnalysis {0},{1},{2}",
                    client.ProjectDB, FromDate, ToDate).SingleOrDefault();

                if (clientAnalysisDTO != null)
                    client.ClientAnalysisDTO = clientAnalysisDTO;

                client.UserID = user.ID;
                client.MainDB = mainDB;
            }

            foreach (var removeDTO in needRemove)
                userClients.Remove(removeDTO);

            if (userClients.Count > 0)
                resultsList.AddRange(userClients);
        }
    }
    return resultsList;
}

Any ideas on what I can do to improve performance?

1 Answer

The first thing I would do is enable .NET tracing and write a line to the trace log before and after each database call.

https://msdn.microsoft.com/en-us/library/zs6s4h68(v=vs.110).aspx
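As a sketch of that idea, a small helper using `System.Diagnostics.Stopwatch` and `Trace` can time each call (the helper name `Timed` and the labels are illustrative, not part of the original code):

```csharp
using System;
using System.Diagnostics;

static class QueryTimer
{
    // Runs an arbitrary query delegate and writes its elapsed time to the trace log.
    public static T Timed<T>(string label, Func<T> query)
    {
        var sw = Stopwatch.StartNew();
        try
        {
            return query();
        }
        finally
        {
            sw.Stop();
            Trace.WriteLine($"{label}: {sw.ElapsedMilliseconds} ms");
        }
    }
}
```

Wrapping each call site, e.g. `QueryTimer.Timed("CloudClientAnalysis", () => db.ExecuteQuery<ClientAnalysisDTO>(...).SingleOrDefault())`, would quickly show whether the main-DB proc, the per-client project-DB proc, or the LINQ query dominates the 5-6 seconds.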

This line makes me suspicious that you might be implicitly building an IN clause in one of the queries, which can perform poorly:

 var ids = string.Join(",", db.UserClients.Where(uc => uc.UserId == user.ID).Select(uc => uc.ClientId.ToString()).ToArray());
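Independent of any IN-clause cost, note that this line also runs once per user. A sketch of hoisting it out of the loop into a single query (assuming the same `ProjSQLDataContext`; `idsByUser` is a name introduced here for illustration) could look like:

```csharp
// One round trip for the whole table, then group client IDs in memory.
// ToList() forces a single query before GroupBy runs client-side, which
// avoids LINQ to SQL issuing extra queries per group.
var idsByUser = db.UserClients
    .Select(uc => new { uc.UserId, uc.ClientId })
    .ToList()
    .GroupBy(x => x.UserId)
    .ToDictionary(
        g => g.Key,
        g => string.Join(",", g.Select(x => x.ClientId)));

// Inside the user loop, the per-user query becomes a dictionary lookup:
// if (!idsByUser.TryGetValue(user.ID, out var ids)) continue;
```

That turns N round trips (one per user) into one, though the stored-proc calls per user and per client remain and should be measured separately.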

Once you have found the weak performer (the line above is just my guess), the next step is to enable database profiling to determine where new indexes or database maintenance are needed.