When the WebJob query is limited with Take, it does not process additional data on the next trigger, and after an interruption it does not continue processing the queue

I have a continuous WebJob that is also triggered by a timer. I removed all the extraneous code to show the program flow.

The Program class sets up the host and calls the functions.

public class Program
{
    static void Main()
    {
        var config = new JobHostConfiguration();
        config.UseTimers();
        var host = new JobHost(config);
        host.RunAndBlock();
    }

    public static void ProcessMyQueue([TimerTrigger("00:00:30", RunOnStartup = true)] TimerInfo timer)
    {
        WebJob1.ProcessMyQueue.ProcessPollEvent();
    }
}

I have the following classes. The query selects rows from the database depending on certain conditions, and this works. It then calls a processing function that modifies the data. After this change, the rows found by the query in ProcessPollEvent no longer match the condition, i.e. they count as processed.

public class ProcessMyQueue
{
    public static void ProcessPollEvent()
    {
        using (var db = new dataC())
        {
            var query = (some query // checking on conditions)
                .Take(10)
                .ToList();
        }
        // aList is created from the query results.
        var threads = aList.Select(rq => Task.Run(() => ProcessRequest(rq))).ToList();
        Task.WaitAll(threads.ToArray());
    }

    public static void ProcessRequest(int rqId)
    {
        var token = new WebJobsShutdownWatcher().Token;
        using (var db = new dataC())
        {
            if (token.IsCancellationRequested)
                return;
            // Here the condition that is being selected upon is changed and saved.
            db.SaveChanges();
        }
    }
}
  • When I run it with the Take limit, it does not process any more data the next time the timer fires.

  • When I run the task without the Take(10) restriction, it works fine, albeit slowly; but if the website is stopped or reloaded during the run, it does not process the remaining data.
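One way to keep the Take(10) limit while still draining the whole backlog would be to re-query in a loop until the batch comes back empty. This is a minimal sketch, not the question's actual code: dataC, the Requests set, the Processed flag, and ProcessRequest are placeholders standing in for the question's own entities and condition; the loop structure is the point.

```csharp
// Sketch: drain the backlog in Take(10) batches inside one timer invocation,
// stopping early if the WebJob is being shut down.
public static void ProcessPollEvent()
{
    var token = new WebJobsShutdownWatcher().Token;

    while (!token.IsCancellationRequested)
    {
        List<int> batch;
        using (var db = new dataC())
        {
            batch = db.Requests                    // hypothetical DbSet
                      .Where(r => !r.Processed)    // hypothetical condition
                      .Select(r => r.Id)
                      .Take(10)
                      .ToList();
        }

        if (batch.Count == 0)
            break; // backlog drained; wait for the next timer trigger

        var tasks = batch.Select(id => Task.Run(() => ProcessRequest(id))).ToList();
        Task.WaitAll(tasks.ToArray());
    }
}
```

Because each iteration re-runs the selection, rows that were marked processed drop out of the next batch, and anything left over after a restart is picked up again by the RunOnStartup trigger.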

In the real-life scenario, more than 100,000 rows of data are processed, and if the website is restarted (that is, if the project is redeployed), the rest of the data is never processed. The number of rows keeps growing and can exceed 1,000,000.

I have looked at and read other questions, but I do not understand how to fix this.

I assumed that restricting the query with Take would speed up the process, but I cannot get the WebJob to process the remaining data afterwards.

I use several tasks to process the data, and the query is the same for all of them.
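Since all the tasks share the same selection condition, a single shared shutdown token checked before each item would make the shutdown cooperative: an unfinished row keeps its original state and is simply re-selected on the next run. This is a sketch under that assumption, not the question's implementation; dataC and the row-update comment are placeholders.

```csharp
// Sketch: one shared WebJobsShutdownWatcher token instead of a new watcher
// per call. A row is only saved if processing completed, so anything skipped
// during shutdown still matches the query on the next startup run.
private static readonly CancellationToken ShutdownToken =
    new WebJobsShutdownWatcher().Token;

public static void ProcessRequest(int rqId)
{
    if (ShutdownToken.IsCancellationRequested)
        return; // leave the row untouched; the next run re-selects it

    using (var db = new dataC())
    {
        // ...modify the row so it no longer matches the selection condition...
        db.SaveChanges();
    }
}
```

Creating the watcher once also avoids allocating a new WebJobsShutdownWatcher for every one of the parallel tasks.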


Source: https://habr.com/ru/post/1212064/
