I have around 400,000 records coming from a file, but after inserting roughly 30,000 to 33,000 of them the application throws a 'System.OutOfMemoryException'.
My system has 8 GB of RAM, which I think should be enough. I am inserting the records in batches of 500 at a time using this code:
this._context.AutoDetectChangesEnabled = false;
if (Counter % 500 == 0)
    _context.SaveChanges();
I have tried every change I could think of, done a lot of R&D, and searched Stack Overflow for possible solutions, but nothing has helped.
I am basically using nopCommerce; the context is obtained via
private readonly IDbContext _context;
Please let me know if anything is unclear!
public partial class EfRepository<T> : IRepository<T> where T : BaseEntity
{
    private IDbContext _context;
    private IDbSet<T> _entities;

    protected virtual IDbSet<T> Entities
    {
        get
        {
            if (_entities == null)
                _entities = _context.Set<T>();
            return _entities;
        }
    }

    public virtual void Insert(IEnumerable<T> entities, bool enableTrackChanges = true)
    {
        try
        {
            if (entities == null)
                throw new ArgumentNullException("entities");

            if (!enableTrackChanges)
            {
                this._context.AutoDetectChangesEnabled = false;
            }

            foreach (var entity in entities)
                this.Entities.Add(entity);

            this._context.SaveChanges();
        }
        catch (DbEntityValidationException dbEx)
        {
            throw new Exception(GetFullErrorText(dbEx), dbEx);
        }
        finally
        {
            // Re-enable change tracking if it was turned off above.
            if (!this._context.AutoDetectChangesEnabled)
                this._context.AutoDetectChangesEnabled = true;
        }
    }
}
You need to recreate the context periodically in order to avoid the OutOfMemoryException. Memory usage keeps growing until Entity Framework can no longer cope, and then the exception is thrown. Since this is a memory limitation, running on an even better computer won't help.
This should work:
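A minimal sketch of the idea: save in batches and dispose/recreate the DbContext every so often, so that the change tracker never accumulates hundreds of thousands of entities. The `StoreContext` and `Item` names below are placeholders for illustration, not types from nopCommerce.

```csharp
// Sketch only: batch inserts while periodically recreating the context.
// Disposing the context releases every entity it was tracking, which is
// what keeps memory usage bounded.
StoreContext context = null;
try
{
    context = new StoreContext();
    context.Configuration.AutoDetectChangesEnabled = false;

    int counter = 0;
    foreach (var item in itemsToInsert)
    {
        context.Set<Item>().Add(item);

        if (++counter % 500 == 0)
        {
            context.SaveChanges();

            // Throw away the old context and start fresh.
            context.Dispose();
            context = new StoreContext();
            context.Configuration.AutoDetectChangesEnabled = false;
        }
    }

    // Flush the final partial batch.
    context.SaveChanges();
}
finally
{
    if (context != null)
        context.Dispose();
}
```

In the nopCommerce repository above, this would mean restructuring `Insert` so the repository can swap in a fresh `IDbContext` between batches rather than holding a single context for the whole import.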
Also see this question for more help on inserting many rows with EF: Fastest Way of Inserting in Entity Framework
Edit: I apologize for being unclear and for not citing sources. The size limit for a single object in .NET is 2 GB; arrays can be bigger on 64-bit machines, but an EF context cannot grow beyond 2 GB. When an EF context is used continuously, its memory usage can increase rapidly until that limit is reached. To free the memory, it is therefore best to renew the context after a while when inserting many rows of data. So the limit is imposed by .NET, not by EF as I mistakenly said before.
About the EF context: https://msdn.microsoft.com/en-us/library/jj729737(v=vs.113).aspx
How to enable >2 GB arrays (it also notes that other objects are unchanged): https://learn.microsoft.com/en-us/dotnet/framework/configure-apps/file-schema/runtime/gcallowverylargeobjects-element
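For reference, that setting is a single element in app.config. Note that it only lifts the limit for arrays; it does not let an EF context grow past 2 GB, so it is not a fix for this problem:

```xml
<configuration>
  <runtime>
    <!-- Allows arrays larger than 2 GB on 64-bit platforms (arrays only). -->
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>
```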
Older, longer post about the 2 GB limit and larger arrays: https://blogs.msdn.microsoft.com/joshwil/2005/08/10/bigarrayt-getting-around-the-2gb-array-size-limit/