Get page load time using the command line - Linux

I have a problem. I have a list of 10 URLs, and I need to calculate the page load time for each of them.

I am currently using curl, wrapped in time, to get the page load time:

time curl www.growingcraft.com
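curl can also report its own timing breakdown via its -w write-out option instead of being wrapped in time; a minimal sketch using standard write-out variables on the same URL:

curl -o /dev/null -s -w 'total: %{time_total}s  dns: %{time_namelookup}s  connect: %{time_connect}s\n' www.growingcraft.com

The write-out variables split the total into DNS lookup, connect and overall transfer time for that single request.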

My second method uses wget:

time wget www.growingcraft.com

The problem is that while this gives me the time taken to load the page, I am not sure about two things:

  • Does this time also include fetching the JavaScript, CSS and images?
  • Some of the images are hosted externally (as in this case). How do I include the time taken to download those images?

Is there any other way to calculate page load time on Linux that would be more efficient / accurate?


You can try:

time wget -pq --no-cache --delete-after www.growingcraft.com 

-p downloads all the page requisites (images, CSS, JavaScript, etc.) needed to display the page.

-q runs wget in quiet mode, suppressing its output.

--no-cache requests a fresh copy from the server rather than a cached one, and --delete-after removes the downloaded files once the transfer finishes, so nothing is left on disk.
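Since you have a list of 10 URLs, a minimal sketch of looping over them with the same command (assuming bash; urls.txt is a hypothetical file with one URL per line):

while read -r url; do
  echo "== $url"
  time wget -pq --no-cache --delete-after "$url"
done < urls.txt

Each iteration prints the shell's time output for fetching that page together with its requisites.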

