I'm working on speeding up a Real Estate Management portal that was poorly designed by a young team using an ORM for the first time.
Among the low-hanging fruit is this overridden SaveChanges() method:
public virtual int SaveChanges(Guid userId)
{
    List<AuditLog> LogsToSave = new List<AuditLog>();

    foreach (var ent in this.ChangeTracker.Entries().Where(p => p.State == System.Data.Entity.EntityState.Added
        || p.State == System.Data.Entity.EntityState.Deleted
        || p.State == System.Data.Entity.EntityState.Modified))
    {
        DateTime changeTime = DateTime.UtcNow;
        try
        {
            ent.Property("ModifiedOn").CurrentValue = changeTime;
            ent.Property("ModifiedBy").CurrentValue = userId;
        }
        catch (Exception)
        {
            // swallow exception
        }
        LogsToSave.AddRange(GetAuditRecordsForChange(ent, userId, changeTime));
    }

    // This call works on small data sets (fewer than ~200 rows):
    System.Threading.Tasks.Task.Run(() => this.BulkInsert<AuditLog>(LogsToSave));

    // This call works on data sets of all sizes, but is synchronous:
    this.BulkInsert<AuditLog>(LogsToSave);

    return base.SaveChanges();
}
The question concerns the last two lines.
If I call BulkInsert inside a simple Task.Run I get the desired effect, but on large data sets I run into a deadlock. I assume this is because threads end up waiting too long to execute.
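The simplest band-aid I can see is to keep the Task.Run call but cap how many background inserts run at once, for example with a SemaphoreSlim. This is only a sketch: the limit of two concurrent inserts is arbitrary, and InsertAuditLogsInBackground is just a name I made up for the helper.

private static readonly SemaphoreSlim AuditInsertGate = new SemaphoreSlim(2, 2);

private Task InsertAuditLogsInBackground(List<AuditLog> logsToSave)
{
    // Sketch only: throttle the fire-and-forget inserts so a burst of saves
    // cannot start an unbounded number of BulkInsert calls at once.
    return Task.Run(async () =>
    {
        await AuditInsertGate.WaitAsync();
        try
        {
            this.BulkInsert<AuditLog>(logsToSave);
        }
        finally
        {
            AuditInsertGate.Release();
        }
    });
}

Even throttled like this, though, the background task still uses the same context instance from another thread, so it doesn't feel like the real answer.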
I would love to make all of the AuditLog BulkInsert operations asynchronous, but I've never designed and built anything like that.
Is there some kind of queue I can build that executes 100 requests at a time? Some kind of system for this situation?
I feel comfortable building something, but I assume there is a solid, battle-tested pattern that does exactly what I want. So far I can't find it.
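Conceptually, something like the producer/consumer sketch below is what I picture: SaveChanges enqueues the audit rows, and a single long-running consumer drains them in batches of up to 100 and bulk inserts each batch on its own context. This is only a sketch of what I mean, not working code: PortalContext is a placeholder for my real context type, the bounded capacity of 10,000 is a guess, and I'm assuming the same BulkInsert<AuditLog> extension used above.

using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading.Tasks;

public static class AuditLogQueue
{
    // Bounded so a runaway producer blocks briefly instead of exhausting memory.
    private static readonly BlockingCollection<AuditLog> Pending =
        new BlockingCollection<AuditLog>(boundedCapacity: 10000);

    static AuditLogQueue()
    {
        // One dedicated consumer so inserts never compete with request threads.
        Task.Factory.StartNew(Consume, TaskCreationOptions.LongRunning);
    }

    public static void Enqueue(IEnumerable<AuditLog> logs)
    {
        foreach (var log in logs)
        {
            Pending.Add(log);
        }
    }

    private static void Consume()
    {
        var batch = new List<AuditLog>(100);
        foreach (var log in Pending.GetConsumingEnumerable())
        {
            batch.Add(log);

            // Flush when the batch is full or the queue is momentarily empty.
            if (batch.Count >= 100 || Pending.Count == 0)
            {
                using (var context = new PortalContext())
                {
                    context.BulkInsert<AuditLog>(batch);
                }
                batch.Clear();
            }
        }
    }
}

SaveChanges would then call AuditLogQueue.Enqueue(LogsToSave) instead of the Task.Run line and return as usual. Is that roughly the right shape, or is there an established pattern for this?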
Threadsafe Async Log reporting to SQL from Entity