I use Firebug for Firefox to achieve the same. Being able to plot the load
times for websites and, more specifically, for the individual items of
content within the site, is enough to satisfy many people, especially
when you point out that the 3.5 MB Flash animation in the middle of the
page really does need several seconds to load, but that the download
time for it was reasonable.
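As a rough stand-in for what Firebug's net panel shows, you can time each resource fetch yourself. A minimal sketch (the local test server and the function name are my own, for illustration only):

```python
import threading
import time
import urllib.request
from http.server import HTTPServer, SimpleHTTPRequestHandler

def time_request(url):
    """Return (status, seconds) for one GET -- roughly the per-resource
    figure a browser's net panel reports."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read()          # include the body transfer in the measurement
        status = resp.status
    return status, time.perf_counter() - start

# Serve the current directory locally so the sketch is self-contained;
# against a real site you would pass each resource URL in turn.
server = HTTPServer(("127.0.0.1", 0), SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = "http://127.0.0.1:%d/" % server.server_address[1]
status, seconds = time_request(url)
print(status, round(seconds, 3))
```

Repeating that per embedded object gives you the "which item is slow" breakdown, though without the browser's DNS/connect/wait split.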
The other tool I've used is iperf. This obviously requires you to have a
well-connected endpoint with appropriate access (often requires root?),
but it can again be used to verify that the bits of the network under your
sphere of influence are performing as expected.
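The idea behind iperf is simply to push bulk data between two endpoints you control and measure achieved throughput. A toy version of that measurement over loopback sockets (sizes and names are arbitrary, purely illustrative; real iperf adds parallel streams, UDP mode, and more):

```python
import socket
import threading
import time

PAYLOAD = b"x" * (1 << 20)   # 1 MiB per send (arbitrary chunk size)
TOTAL_MB = 16                # total data to push (arbitrary)

def sink(server_sock):
    """Accept one connection and drain it, like an iperf server."""
    conn, _ = server_sock.accept()
    with conn:
        while conn.recv(65536):
            pass

server_sock = socket.create_server(("127.0.0.1", 0))
threading.Thread(target=sink, args=(server_sock,), daemon=True).start()

start = time.perf_counter()
with socket.create_connection(server_sock.getsockname()) as c:
    for _ in range(TOTAL_MB):
        c.sendall(PAYLOAD)
elapsed = time.perf_counter() - start
mbit_s = TOTAL_MB * 8 / elapsed
print("%.0f Mbit/s over loopback" % mbit_s)
```

Run the sink on the far end and the sender locally (or vice versa) and you get a defensible throughput number for the path between them.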
On 28/02/12 09:53, Leith Bade wrote:
Have you looked at the timeline feature hidden in
It is designed to allow web developers to tune the loading speed of
their web pages but may provide a reliable way to measure request
latency and load time.
On 28 February 2012 07:06, Glen Eustace <geustace(a)godzone.net.nz
In my role at Massey University, I am often asked to investigate
complaints from our Help Desk of a 'the Internet is slow' nature.
We are finding this increasingly difficult to do in any
meaningful way. Massey is peered at WIX, PNIX and APE; we have two
Internet peers with multiple paths to them, and they both have
multiple upstream providers.
Simple ping testing for latency/connectivity doesn't really
provide much insight, and traceroute tells us where outbound
packets went and which hops are 'slow' but doesn't indicate the
return path. Most of our customers equate Website == Internet, so
the responsiveness of a destination also seems to be an important
factor.
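One way to measure responsiveness toward a destination that may rate-limit or drop ICMP is to time a TCP handshake instead of a ping: one handshake is one round trip. A small sketch (the local listener is only there to make the example self-contained; in practice you would probe the destination site on port 80 or 443):

```python
import socket
import time

def tcp_connect_ms(host, port, timeout=3.0):
    """Time a TCP three-way handshake in milliseconds. One handshake is
    one round trip, so this approximates ping for hosts that drop ICMP."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return (time.perf_counter() - start) * 1000.0

# Demonstrate against a local listener; the kernel completes the
# handshake for any listening socket, so no accept() is needed.
listener = socket.create_server(("127.0.0.1", 0))
host, port = listener.getsockname()
print("%.2f ms" % tcp_connect_ms(host, port))
```

Logging this figure periodically toward a handful of popular destinations gives you a baseline to compare against when a 'the Internet is slow' ticket arrives.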
I am assuming that most ISPs have Help Desk calls of a similar
nature. How does one substantiate or refute such complaints, or
pro-actively identify 'slowness/congestion'? Evidence collected
by 'tools' needs to be defensible when responding to our
customers. Any pointers to how I can do this better would be appreciated.
NZNOG mailing list