I have this Linq-to-SQL query
DaynamicContext.Log = new System.IO.StreamWriter("f:\\linq-to-sql.txt") { AutoFlush = true };
var tt = DaynamicContext.GetTable(tbl);
var query = (from c in tt where c.IsDeleted != true select c.Id).Take(1).ToList();
The final query returns the correct result: a single Id. The problem is that when the table is large I get an OutOfMemoryException. When I checked the generated SQL:
SELECT [t0].[Id], [t0].[CreatedBy], [t0].[CreatedDate], [t0].[ModifiedBy],
[t0].[ModifiedDate], [t0].[IsDeleted], [t0].[UntileTime], [t0].[Desktop],
[t0].[Laptop], [t0].[Title], [t0].[Responsive], [t0].[Mobile], [t0].[ActiveTime],
[t0].[Tablet]
FROM [Countent_FreeArea] AS [t0]
it seems that LINQ-to-SQL is fetching every row from the database and applying the filter in memory. The context is created like this:
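A likely cause is that the non-generic GetTable(Type) overload does not give the compiler a typed IQueryable, so the where and Take run as LINQ-to-Objects over the whole table. A sketch of a fix, assuming the entity type is known at compile time (the FreeArea class and its mapping below are hypothetical, inferred from the column list in the generated SQL):

using System.Data.Linq;
using System.Data.Linq.Mapping;
using System.Linq;

[Table(Name = "Countent_FreeArea")]
public class FreeArea
{
    [Column(IsPrimaryKey = true)]
    public int Id { get; set; }

    [Column]
    public bool? IsDeleted { get; set; }
}

public static class Example
{
    public static int FirstActiveId(DataContext context)
    {
        // The generic GetTable<TEntity>() returns Table<FreeArea>, which is
        // IQueryable<FreeArea>, so Where/Select/Take stay in the expression
        // tree and are translated to SQL with a WHERE and TOP clause,
        // instead of being evaluated row-by-row in memory.
        Table<FreeArea> table = context.GetTable<FreeArea>();

        return table
            .Where(c => c.IsDeleted != true)
            .Select(c => c.Id)
            .Take(1)
            .FirstOrDefault();
    }
}

With a typed source like this, the generated SQL should select only [Id] with a TOP (1) and a WHERE clause, rather than the full SELECT of every column shown above.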
public class Context
{
    private DataContext daynamicContext;

    public DataContext DaynamicContext
    {
        get
        {
            if (daynamicContext == null)
            {
                System.Configuration.ConnectionStringSettingsCollection connectionStrings = WebConfigurationManager.ConnectionStrings;
                daynamicContext = new DataContext(connectionStrings["connectionStrings"].ToString());
            }
            return daynamicContext;
        }
    }
}
The GetTable method will fetch the whole table first and then select the first value from the list in memory. Also, instead of using Take(), use FirstOrDefault().
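The suggested change would look roughly like this (a sketch, assuming a typed query source; the FreeArea entity is a hypothetical mapping for the Countent_FreeArea table):

using System.Data.Linq;
using System.Data.Linq.Mapping;
using System.Linq;

[Table(Name = "Countent_FreeArea")]
public class FreeArea
{
    [Column(IsPrimaryKey = true)]
    public int Id { get; set; }

    [Column]
    public bool? IsDeleted { get; set; }
}

public static class Answer
{
    public static int GetFirstId(DataContext context)
    {
        Table<FreeArea> table = context.GetTable<FreeArea>();

        // FirstOrDefault() on an IQueryable<T> is translated to a
        // SELECT TOP (1) query and returns a single value (or the
        // default, 0 here) instead of materializing a List<int>.
        return (from c in table
                where c.IsDeleted != true
                select c.Id).FirstOrDefault();
    }
}

Note that the key difference from the original code is the typed Table<FreeArea> source; on an untyped, in-memory sequence, swapping Take(1).ToList() for FirstOrDefault() alone would not stop the full table from being fetched.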