I am running Appstats on my Java-based HRD application on Google App Engine. Right now I am puzzled by the differences in timing values between the actual App Engine request log and the Appstats logs. There seems to be quite a drift between those times. To show what I mean, I took two screenshots. Both are from the same request.
This one is from the App Engine request log:
And this one is for the same request from Appstats:
Here is a list of measurements I was able to extract:
Log Timestamp 10:10:33.156
Log "Total" 147ms
Log "ms" 147ms
Log "cpu_ms" 86ms
Appstats Timestamp 10:10:33.072
Appstats "Grand Total" 39ms
Appstats "real" 39ms
Appstats "RPC Total" 20ms
Appstats "api" 0ms
Appstats "overhead" 0ms
As you can see, there is quite a difference both in the request timestamp and the runtime:
Difference Log/AS Timestamps 84ms
Difference Log/AS Total 108ms (377%)
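Just to double-check the arithmetic, here is a small stand-alone snippet (values copied from the table above) that recomputes the drift:

```java
// Recomputes the difference between the request log's "ms" value
// and Appstats' "real" value for the request shown above.
public class TimingDrift {
    public static void main(String[] args) {
        long logTotalMs = 147;    // "ms" from the App Engine request log
        long appstatsRealMs = 39; // "real" from Appstats

        long driftMs = logTotalMs - appstatsRealMs;
        long ratioPercent = Math.round(100.0 * logTotalMs / appstatsRealMs);

        System.out.println(driftMs + "ms (" + ratioPercent + "%)"); // prints "108ms (377%)"
    }
}
```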
By the way: although the request timestamps differ, it is still the same request. I followed the link Appstats posted to the request log, and I ran this test on an otherwise unused testing application, so I can be sure the link is correct.
Why is Appstats telling me that my request took 39ms while the GAE request log tells me that the request took 3.8x as long (147ms)? And why does the same request have two different timestamps?
Thanks a lot!
Appstats is hooked into your code, so there will be some time before your code (and Appstats) is invoked, and also some time after Appstats exits. Remember that App Engine doesn't stream the response to the client: all output goes into a buffer of some sort, which is then handed to the App Engine infrastructure for delivery.
The timeline of a request will be something like:

1. The App Engine infrastructure receives the request (this is the request log's timestamp) and invokes your application's servlet stack.
2. The Appstats filter starts recording (this is the Appstats timestamp) and your code runs.
3. Your code finishes; Appstats stops recording and saves its data.
4. The buffered response is handed back to the infrastructure and delivered to the client; the request log's total ("ms") covers everything up to here.
It would be interesting to see how much time steps 1 and 4 take on average, and that may depend on the runtime.
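The effect can be sketched with nested timers, where hypothetical sleeps stand in for each step; the inner timer plays the role of Appstats' "real" value and the outer one plays the role of the log's "ms" value:

```java
// Illustrates why a timer inside a filter (like Appstats) always reports
// less than a timer around the whole request (like the request log).
public class NestedTimers {
    static long innerMs; // what the "Appstats" window measures

    static void handleRequest() throws InterruptedException {
        Thread.sleep(30); // the application code that Appstats measures
    }

    static void filter() throws InterruptedException {
        long start = System.currentTimeMillis();
        handleRequest();
        innerMs = System.currentTimeMillis() - start; // Appstats' "real"
    }

    public static void main(String[] args) throws InterruptedException {
        long start = System.currentTimeMillis();
        Thread.sleep(20); // step 1: routing and servlet-stack startup
        filter();
        Thread.sleep(20); // step 4: buffered response handed off for delivery
        long outerMs = System.currentTimeMillis() - start; // the log's "ms"

        System.out.println(innerMs < outerMs); // prints "true"
    }
}
```

The inner window can never include the time spent in steps 1 and 4, so it is always smaller than the outer one; the sleep durations here are made up, not measurements.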
Oh, and Appstats itself has some overhead that won't be measured by Appstats, including starting up and cleaning up at the end, which contributes to the time in steps 1 and 4.
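For reference, Appstats on Java is wired in as a servlet filter in `web.xml`, which is why it can only time the part of the request that runs inside the filter chain:

```xml
<filter>
    <filter-name>appstats</filter-name>
    <filter-class>com.google.appengine.tools.appstats.AppstatsFilter</filter-class>
</filter>
<filter-mapping>
    <filter-name>appstats</filter-name>
    <url-pattern>/*</url-pattern>
</filter-mapping>
```

Everything that happens before the filter chain is entered, or after it returns, is invisible to Appstats but still counted by the request log.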