Timer from Global.asax vs. Quartz.NET

I am developing an ASP.NET site that crawls several social networking sites daily to gather common friends/followers. I chose Arvixe Business Class as my hosting. In the future, if we grow, I would like to move to a dedicated server and run a Windows service, but since there are no plans for that yet, I need another reliable way to run scheduled tasks. I am familiar with starting a thread timer from App_Code (Global.asax). However, application pool recycling causes problems with that timer. I have never used a task scheduler like Quartz, but I have read a lot about it on Stack Overflow. I am looking for some tips on how to approach my goal. One of the big problems I face is that the crawler threads regularly need to sleep for an hour because of API call limits. My first thought was to use the database to record the start and end of each job. When the application pool recycles, I would restart any jobs that were not completed, and start only the jobs that have no record for that day. What do the experts think? Any good references on the typical architecture for this kind of scheduling?
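A minimal sketch of the job-record idea described in the question (all names here are hypothetical; a real version would persist `JobRun` rows in the database rather than an in-memory list):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical record of one crawl run; in practice this would be a DB row.
public class JobRun
{
    public string JobName;
    public DateTime StartedUtc;
    public DateTime? CompletedUtc;   // null = still running, or killed mid-run
}

public static class JobPlanner
{
    // On application start (i.e. after a pool recycle), decide which jobs to
    // (re)start today: jobs with no record for today, plus jobs whose record
    // was never marked complete.
    public static IEnumerable<string> JobsToStart(
        IEnumerable<string> allJobs, IEnumerable<JobRun> runs, DateTime nowUtc)
    {
        var today = runs.Where(r => r.StartedUtc.Date == nowUtc.Date).ToList();
        return allJobs.Where(job =>
        {
            var run = today.FirstOrDefault(r => r.JobName == job);
            return run == null || run.CompletedUtc == null;
        });
    }
}
```

Calling `JobsToStart` from `Application_Start` in Global.asax would give you the recovery behavior the question describes, at the cost of the caveats the answers below raise about relying on IIS at all.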

+4
4 answers

It doesn't matter which method you use, whether you roll your own or use Quartz: you are at the mercy of ASP.NET/IIS, because that is where you want to host it.

Do you have a spare computer that could run a scheduled task and upload data to the hosted database? To be honest, it may be safer (depending on your use case) to just do that than to try to run a scheduler inside ASP.NET.

+1

Somewhat along the same lines as Brian:

Find a backup computer.

Instead of giving it direct access to the database, have it call a web service on your site. That call should be the initiator of the process you are trying to run. Don't try to pass parameters into it; something like "StartProcess()" should work fine.
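A minimal sketch of such a parameterless trigger as a classic ASMX web service (the answer suggests WCF, which would work the same way; `CrawlRunner.RunAll` is a hypothetical entry point into your crawl logic):

```csharp
using System.Threading;
using System.Web.Services;

[WebService(Namespace = "http://tempuri.org/")]
public class CrawlService : WebService
{
    // Parameterless trigger, as the answer suggests: the caller only starts
    // the process; all configuration lives on the server side.
    [WebMethod]
    public void StartProcess()
    {
        // Return immediately and do the real work on a background thread,
        // so the triggering HTTP call does not time out.
        ThreadPool.QueueUserWorkItem(_ => CrawlRunner.RunAll());
    }
}
```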

For sleeping and resuming, take a look at Windows Workflow Foundation. It has some useful built-in support for persisting state.

This way you do not expose your database to the outside world; you only expose this one page or web service, and you can put some protection around it. WCF has some useful security features for this.

The best part is that when you do decide to switch, you can keep your web service and call it from a Windows service in exactly the same way.

0

As long as you use a persistent job store (such as a database), and you write and schedule your jobs so that they can handle things like being killed halfway through, IIS recycling your process is not that important.

The big problem is that IIS shuts your site down when it gets no traffic. If you can keep the site alive, set the misfire policy correctly, and have your jobs store the state they need to pick up where they left off, you should be able to pull it off.
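A sketch of the two settings this answer mentions, in Quartz.NET terms: a persistent ADO.NET job store plus an explicit misfire instruction. The connection string, trigger name, and cron expression are placeholders, and the exact property names vary by Quartz.NET version, so check the configuration reference for your release:

```csharp
using System.Collections.Specialized;
using Quartz;
using Quartz.Impl;

var props = new NameValueCollection
{
    // Persist jobs and triggers in a database so they survive recycles.
    ["quartz.jobStore.type"] = "Quartz.Impl.AdoJobStore.JobStoreTX, Quartz",
    ["quartz.jobStore.dataSource"] = "default",
    ["quartz.dataSource.default.provider"] = "SqlServer",
    ["quartz.dataSource.default.connectionString"] =
        "Server=.;Database=quartz;Trusted_Connection=True;",
    ["quartz.serializer.type"] = "json"
};

var scheduler = await new StdSchedulerFactory(props).GetScheduler();

ITrigger trigger = TriggerBuilder.Create()
    .WithIdentity("dailyCrawl")
    // If the trigger misfires (e.g. the site was asleep at 03:00), fire it
    // as soon as possible instead of silently skipping the day.
    .WithCronSchedule("0 0 3 * * ?",
        x => x.WithMisfireHandlingInstructionFireAndProceed())
    .Build();
```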

0

If you are language-agnostic and do not mind writing your "script invocation" in your favorite Linux-supported language...

One solution that worked for me very well:

  • Get relatively cheap, stable Linux hosting (from a reputable company),
  • Create a WCF service on your .NET host containing the logic you want to run regularly (RESTful, SOAP, or XML-RPC, whichever suits you),
  • Trigger calls to it from Linux cron jobs written in your language of choice (I use PHP).
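The cron entry for the last step might look like this (the script path is a placeholder; `trigger_crawl.php` would simply call the trigger endpoint on the .NET host):

```
# m  h  dom mon dow  command
  0  3  *   *   *    /usr/bin/php /home/user/trigger_crawl.php >> /var/log/crawl.log 2>&1
```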

It works very well, as I said: no VPS costs, configurable, and externally triggered. I have one central place where my jobs are triggered, with 99 to 100% uptime (there have never been any failures).

0
