Is there a way to determine the time a client spends on a web page

Assume I have an open source web server or proxy server that I can modify, say Apache or Squid.

Is there a way to determine the time each client spends on a web page?

HTTP, of course, is stateless, so it's not trivial, but maybe someone has an idea on how to approach this problem?

+6
apache proxy apache2 squid
5 answers

Not without some kind of client-side JavaScript that constantly hits your server, so you can see when the hits stop (which of course assumes the user has JavaScript enabled). There are also various (ugly) ways of detecting window closes from JavaScript, but of course they will not always fire, e.g. on a browser crash.
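
For illustration, here is a minimal browser-side sketch of that close-detection idea; the /analytics/leave endpoint is made up for the example, and as noted it will not fire on a browser crash:

```js
// Report how long the page was open when the user leaves it.
// "/analytics/leave" is a hypothetical endpoint; adjust to your server.
const pageLoadedAt = Date.now();

// "pagehide" is more reliable than "unload" in current browsers,
// but still will not fire if the browser crashes or is killed.
window.addEventListener('pagehide', () => {
  const secondsOnPage = Math.round((Date.now() - pageLoadedAt) / 1000);
  // sendBeacon queues the request even while the page is being torn down.
  navigator.sendBeacon('/analytics/leave',
    JSON.stringify({ page: location.pathname, seconds: secondsOnPage }));
});
```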

I wonder why you want this, though. What if a person looks at a web page for 3 seconds, gets distracted by another tab/window, but leaves the page open for 2 hours? The answer you would get is 2 hours; the answer you (probably) want is 3 seconds.

+4

With Apache or Squid, you are unlikely to be able to determine the time a user spends on your page.

But with a little extra sugar on your web page, you can. Google Analytics is the obvious candidate here.

It is free and has many features.

But you will also be inviting Google to look at your site's statistics... (then again, maybe that will help them decide whether they want to buy you :-))

+5

You can count the time between one page request and the next, but that is only accurate if the user stayed on the page the whole time until requesting the next one. Even then, they might still be on the original page (for example, they opened the next one in a new tab), and it only works at all if they go on to request another page.

The only way to know for sure is to have the open page ping the server with JavaScript every ten seconds or so, just to say "I'm still reading!"
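
A minimal sketch of that heartbeat, assuming a hypothetical /analytics/ping endpoint whose only job is to log the hit so you can later see when the pings stopped:

```js
// Ping the server every ten seconds while the page is open.
// "/analytics/ping" is a made-up endpoint.
setInterval(() => {
  fetch('/analytics/ping?page=' + encodeURIComponent(location.pathname), {
    method: 'POST',
    keepalive: true, // lets the request complete even during unload
  }).catch(() => { /* best effort: ignore network errors */ });
}, 10000);
```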

+2

I have actually seen JavaScript analytics packages that not only tracked how long you were on the page (by pinging the server frequently), but also watched what was on screen. By measuring the size of the browser window and the scroll position of the document, they could tell exactly how long each item was on screen. By tracking the mouse position you can probably also get a good idea of what the user is looking at. I can't find the link right now, but that is the idea. If you are really interested in what people look at and for how long, you can do it. There is hardly any limit to what you can track.
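
With today's browser APIs the on-screen measurement part is fairly easy; this sketch uses IntersectionObserver to accumulate how long each tracked element is visible (the `.tracked` selector and the reporting are placeholders):

```js
// Track how long each marked element of the page is actually on screen.
const visibleSince = new Map();  // element -> timestamp it became visible
const timeOnScreen = new Map();  // element id -> accumulated milliseconds

const observer = new IntersectionObserver((entries) => {
  const now = Date.now();
  for (const entry of entries) {
    if (entry.isIntersecting) {
      visibleSince.set(entry.target, now);
    } else if (visibleSince.has(entry.target)) {
      const id = entry.target.id || 'unnamed';
      timeOnScreen.set(id,
        (timeOnScreen.get(id) || 0) + (now - visibleSince.get(entry.target)));
      visibleSince.delete(entry.target);
    }
  }
}, { threshold: 0.5 }); // count an element as visible at 50% or more

document.querySelectorAll('.tracked').forEach((el) => observer.observe(el));
```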

Also, just a thought: if you do not want to ping the server too often, you can buffer events in memory and only send them to the server once you have enough data or right before the page is closed.
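
A sketch of that buffering approach; the event shape, buffer size, and /analytics/batch endpoint are all assumptions for the example:

```js
// Buffer analytics events in memory and flush them in batches.
const buffer = [];
const MAX_BUFFERED = 20;

function record(event) {
  buffer.push({ ...event, t: Date.now() });
  if (buffer.length >= MAX_BUFFERED) flush();
}

function flush() {
  if (buffer.length === 0) return;
  // sendBeacon still works while the page is being hidden or unloaded.
  navigator.sendBeacon('/analytics/batch', JSON.stringify(buffer));
  buffer.length = 0;
}

// Flush whatever is left when the tab is hidden or the page is closed.
document.addEventListener('visibilitychange', () => {
  if (document.visibilityState === 'hidden') flush();
});
```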

+2

This type of metric was quite popular a few years ago, before PCs became more powerful and tabbed browsers became popular, which made it harder to measure. The standard approach back then was to assume that people generally load just one page at a time and to use server log data to determine the time between page views. Standard analytics packages such as Omniture and Urchin (now Google Analytics) calculate this.

You usually set tracking cookies so you can identify a specific person/browser over time, but in the short term you can simply use the combination of IP address and user agent.

So basically you just walk through the log data and count the delta between page views as the time the person spent on a page. You set some rules (or your analytics provider does it behind the curtain), such as dropping or truncating times beyond some cutoff (say 10 minutes), where you assume the person did not actually keep reading but left the page open in a window/tab.
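
To make that concrete, here is a rough Node.js sketch that walks an Apache combined-format access log, keys visitors by IP plus user agent, and attributes the delta between consecutive requests to the previous page, dropping anything past a 10-minute cutoff (the file name, regex, and date handling are simplifying assumptions):

```js
// Rough sketch: estimate time-on-page from an Apache "combined" access log.
const fs = require('fs');

const CUTOFF_MS = 10 * 60 * 1000; // assume reading stopped after 10 minutes
const LINE = /^(\S+) \S+ \S+ \[([^\]]+)\] "(?:GET|POST) (\S+)[^"]*" \d+ \S+ "[^"]*" "([^"]*)"/;

const lastSeen = new Map();   // visitor -> { when, path } of their last hit
const totalTime = new Map();  // page path -> accumulated milliseconds

for (const line of fs.readFileSync('access.log', 'utf8').split('\n')) {
  const m = LINE.exec(line);
  if (!m) continue;
  const [, ip, ts, path, userAgent] = m;
  // Crude parse of "10/Oct/2000:13:55:36 -0700" into a JS Date.
  const when = new Date(ts.replace(':', ' ').replace(/\//g, ' ')).getTime();
  const visitor = ip + '|' + userAgent;

  const prev = lastSeen.get(visitor);
  if (prev && when - prev.when <= CUTOFF_MS) {
    // The time until this request is credited to the page viewed before it.
    totalTime.set(prev.path, (totalTime.get(prev.path) || 0) + (when - prev.when));
  }
  lastSeen.set(visitor, { when, path });
}

for (const [page, ms] of totalTime) {
  console.log(page, Math.round(ms / 1000) + 's');
}
```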

Is this data perfect? Obviously not. But you only need data that is "good enough" for statistical analysis in order to draw some conclusions.

It is still useful for longitudinal analysis (readers' habits over time) and for qualitative comparisons between different pages of your site (e.g. between two 700-word articles, if one has an average reading time twice as long as the other, then more people actually read the first article). Of course, your site has to be busy enough to give you a sufficient number of data points for a statistically sound analysis after you throw away all the "bad" outlier data.

And yes, you can use JavaScript to send keep-alives to improve the data. You can simply poll at specific intervals after document.onload, or set up mouseover events on sections of your pages.
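
One possible shape for that, gating the interval poll on recent mouse/keyboard activity so an idle tab stops reporting (the endpoint and the 30-second idle threshold are made up):

```js
// Keep-alive pings that stop once the user has been inactive for a while.
let lastActivity = Date.now();

['mousemove', 'mouseover', 'scroll', 'keydown'].forEach((evt) =>
  document.addEventListener(evt, () => { lastActivity = Date.now(); }));

window.addEventListener('load', () => {
  setInterval(() => {
    // Only report the page as "being read" if the user did something recently.
    if (Date.now() - lastActivity < 30000) {
      navigator.sendBeacon('/analytics/keepalive', location.pathname);
    }
  }, 10000);
});
```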

Another technique is to use JavaScript to add an onclick handler to every <a href> that pings your server. Not only do you know when someone clicks a link that takes them off your site; more sophisticated "hot spot" analysis takes the view that if someone clicks a link that sits 6 paragraphs down the page, they must have read at least that far.
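
A small sketch of that link tracking, using event delegation and a hypothetical /analytics/click endpoint:

```js
// Report every link click to the server before the browser navigates away.
document.addEventListener('click', (e) => {
  const link = e.target instanceof Element ? e.target.closest('a[href]') : null;
  if (!link) return;
  // sendBeacon is asynchronous and survives the navigation off the page.
  navigator.sendBeacon('/analytics/click', JSON.stringify({
    from: location.pathname,
    to: link.href,
  }));
});
```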

0
