I am implementing click tracking from various pages on our corporate intranet, in order to add some much-needed "popular links" listings ("the most popular links in your department in the last 24 hours", etc.).
I use jQuery's .live() to bind to the mousedown event for all link elements on the page, filter the event, and then fire a pseudo-AJAX request with various data at the internal tracking server before returning true so that the link's default action goes ahead:
$("#contentarea a").live("mousedown", function(ev) { // // detect event, find closest link, process it here // $.ajax({ url: 'my-url', cache: false, dataType: 'jsonp', jsonp: 'cb', data: myDataString, success: function() { // silence is golden -- server does send success JSONP but // regardless of success or failure, we allow the user to continue } }); return true; // allow event to continue, user leaves the page. }
As you can guess from the above, I have several limitations:
- The internal tracking server is on a different subdomain from the calling page, and I cannot get around that. This is why I use JSONP (and GET) rather than proper AJAX with POST. I cannot implement an AJAX proxy because the web servers have no outgoing network access for scripts.
- This is probably not relevant, but in the interest of full disclosure: the content and the script live inside an iframe of the main content page (and this will not change; most likely I will eventually move the event listener to the parent frame, to track its links plus all the child content, but step 1 is getting it working correctly in the simplified one-child-window case -- see the first sketch after this list). Parent and child are on the same domain.
- The back end is IIS/ASP (again a constraint, so don't ask!), so I can't readily fork a background process, or complete the response early and keep processing, the way I could on a better platform. For reference, the endpoint is essentially the second sketch after this list.
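(For the eventual step 2, since parent and child are on the same domain, I am picturing something along these lines. This is only a sketch: "contentframe" is a placeholder id, and trackClick stands in for the handler above.)

    // Sketch of the eventual parent-frame version. .live() only delegates
    // within its own document, so the parent would have to (re)bind into
    // the child document on every iframe load instead.
    $("#contentframe").load(function() {
        $(this).contents()
            .find("#contentarea a")
            .bind("mousedown", trackClick); // trackClick = the handler above
    });

(And for reference, the tracking endpoint is essentially the shape below -- a sketch in classic-ASP JScript with hypothetical names, not the real code.)

    <%@ Language="JScript" %>
    <%
    // Sketch of the tracking endpoint (names and DSN are hypothetical).
    // The DB write happens before any response is sent, so the whole
    // request lives or dies with the client connection.
    var cb = "" + Request.QueryString("cb"); // JSONP callback name from jQuery
    var conn = Server.CreateObject("ADODB.Connection");
    conn.Open("DSN=clicktracking");          // placeholder connection string
    conn.Execute("INSERT INTO clicks ...");  // elided; the real query is parameterized
    conn.Close();
    Response.ContentType = "application/javascript";
    Response.Write(cb + '({"status":"ok"});'); // minimal JSONP envelope
    %>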
Despite all this, the system mostly works -- I click links on the page and they show up in the database quite happily.
However, it is not reliable: a large number of links never get recorded, especially off-site links and links whose target is set to "_top". If the link opens in a new tab or window, it registers fine.
I have ruled out script errors, so it seems that either:
(a) the request never makes it out in time; or
(b) the request does make it, but ASP detects that the client disconnects shortly afterwards and, because it is a GET request, does not process it.
I suspect (b), since the server turnaround is very fast and plenty of links do register OK. And if I throw up an alert() popup after the event fires, or return false from the handler, the click registers fine. (A server-side way to confirm this is sketched just below.)
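(The test I have in mind for (b): log, server-side, whether ASP still sees the client as connected by the time the script runs. Response.IsClientConnected is a real classic-ASP property; the logging helper here is hypothetical.)

    <%
    // Diagnostic sketch for hypothesis (b): if this logs "disconnected" for
    // exactly the _top links that go missing, ASP is seeing the disconnect.
    var state = Response.IsClientConnected ? "connected" : "disconnected";
    logHit(state + ": " + Request.QueryString); // logHit = hypothetical logger
    %>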
Any advice on how I can solve this (given that I cannot change my constraints)? I cannot make the GET request synchronous, as JSONP is not real AJAX (the sketch below spells out why, as I understand it).
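(With dataType 'jsonp', jQuery never touches XMLHttpRequest; it injects a script tag, roughly as below, and script injection has no synchronous mode. The host and values here are placeholders.)

    // Roughly what the $.ajax() call above turns into: a GET via <script>.
    // "cb" carries the generated callback name (because of jsonp: 'cb'),
    // and "_" is the anti-cache timestamp added by cache: false.
    var s = document.createElement("script");
    s.src = "http://tracking.example.intra/click.asp" // placeholder host/path
          + "?cb=jQuery1234_5678"                     // generated callback name
          + "&" + myDataString                        // the tracking payload
          + "&_=" + new Date().getTime();             // cache buster
    document.getElementsByTagName("head")[0].appendChild(s);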
Q: Would it work better if I made it a POST request to the ASP server? If (b) is the culprit, would it behave differently for POST than for GET? If so, I could use a hidden iframe/form combination to submit the data (roughly as sketched below). However, I suspect this would be slower and clumsier, and might still not get there in time. I also could not listen for the request completing, because it is cross-domain.
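(If POST does behave differently, the fallback I am imagining is roughly this; the iframe name, URL and field handling are all hypothetical.)

    // Sketch of the hidden-iframe POST fallback: no JSONP, fire and forget.
    // The response is unreadable cross-domain, which is fine here.
    // Assumes a permanent hidden iframe already in the page:
    //   <iframe name="tracksink" style="display:none"></iframe>
    function postClick(dataObj) {
        var $form = $('<form method="post" target="tracksink"></form>')
            .attr("action", "http://tracking.example.intra/click.asp");
        $.each(dataObj, function(name, value) {
            $('<input type="hidden">').attr("name", name).val(value)
                                      .appendTo($form);
        });
        $form.appendTo("body").submit();
    }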
Q: Can I just add a delay to the script after the GET request is sent? How would I do that in single-threaded JavaScript? I need to return true from my handler so that the default action eventually fires, so I can't use setTimeout(). Would a tight loop that waits for the success callback to fire and set some variable work? I worry that would freeze everything so badly that the response itself gets held up. I assume the jQuery delay() plugin is such a loop too? (The closest workaround I can think of is sketched below.)
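(As far as I can tell, a tight loop cannot work even in principle: the JSONP success callback has to run on the same thread the loop is blocking, so the loop would spin until its own timeout no matter what. The closest thing to a "delay" I can come up with is the usual outbound-click-tracking pattern -- cancel the default action and navigate manually after a short grace period -- which means not returning true at all. A sketch, where the 150 ms is a guess rather than a measured value:)

    // Sketch: take over navigation instead of returning true.
    // Real code would let middle-clicks, modifier-key clicks and
    // non-link targets fall through untouched.
    $("#contentarea a").live("click", function(ev) {
        var href = this.href;
        $.ajax({ url: 'my-url', cache: false, dataType: 'jsonp',
                 jsonp: 'cb', data: myDataString });
        setTimeout(function() {
            top.location.href = href; // top, since many links target "_top"
        }, 150); // grace period for the GET to leave the browser; a guess
        return false; // cancel the default navigation; we do it ourselves
    });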
Or is something else that I have not thought of the culprit?
I don't need bulletproof reliability. If every link were tracked with the same 95% success rate, that would be fine. As it stands, though, some links track 100% of the time while others never register at all -- and that is not going to cut it for what I am trying to achieve.
Thanks in advance.