Point Averaging Times

Good Day,

I’m using an RS2+ and am looking to see what others are typically doing in the field for shooting and setting points.

I’m fairly familiar with the OPUS workflow and its time-vs-accuracy trade-offs (4-5 hours puts you in a good spot, letting the RS2+ sit for a full 8+ hour session can eke out slightly better accuracy, the good/bad/ugly of Rapid Static, etc.)

I’m usually using a single RS2+ to shoot GCPs for drone aerial surveys, as well as to establish accurate base points for future use and RTK drone flights. My main question is about averaging times when using NTRIP for GCPs and base points. I have a decent understanding of the error types, but am lacking the actual field practice in this regard. Assuming I have a reasonably close CORS station or am using a VRS, is there a good ‘rule of thumb’ for fix-averaging times when shooting points with NTRIP? What accuracy do you generally expect to gain or lose by averaging for x amount of time vs. just an instantaneous coordinate grab?

I always shoot a 2-minute averaged fix and loop back to stake them out on my way back to the truck.


What you are really measuring is time; the position itself is being calculated. With this in mind, theoretically, the longer you occupy the point, the more satellite observations you collect, the more equations make it into the matrix, and the better the PDOP and accuracy. I usually work in RTK and occupy the point for at least 1 minute, and the time goes up for station points, base points, or bad PDOP.
I only know of reasons to worry about short observation logs; what would be the reason for more time to cause a loss in quality?
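To put the "more equations in the matrix" idea in concrete terms: in a standard least-squares position solve, PDOP falls out of a geometry matrix built from receiver-to-satellite unit vectors, and adding satellites can only shrink it. Here is a minimal sketch; the azimuth/elevation values are made up for illustration, not real ephemeris:

```python
import math
import numpy as np

def unit_vector(az_deg, el_deg):
    """East/North/Up unit vector pointing from receiver to satellite."""
    az, el = math.radians(az_deg), math.radians(el_deg)
    return [math.cos(el) * math.sin(az),  # east
            math.cos(el) * math.cos(az),  # north
            math.sin(el)]                 # up

def pdop(sat_angles):
    """PDOP from a list of (azimuth, elevation) pairs in degrees.

    Each satellite contributes one row [e, n, u, 1] to the design
    matrix (three position unknowns plus receiver clock); PDOP is the
    square root of the position part of the covariance trace."""
    A = np.array([unit_vector(az, el) + [1.0] for az, el in sat_angles])
    Q = np.linalg.inv(A.T @ A)
    return float(np.sqrt(np.trace(Q[:3, :3])))

# Five satellites: four low on the horizon plus one at zenith.
few = [(0, 30), (90, 30), (180, 30), (270, 30), (0, 90)]
# The same five plus four more at mid elevation.
many = few + [(45, 60), (135, 60), (225, 60), (315, 60)]

print(f"PDOP with {len(few)} sats: {pdop(few):.2f}")
print(f"PDOP with {len(many)} sats: {pdop(many):.2f}")  # lower is better
```

Running it shows the nine-satellite constellation gives a clearly lower PDOP than the five-satellite one, which is the "more equations > better PDOP" effect in miniature.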


Thanks; what would you say is the top end of your observation times for station/base points, and why?

I don’t believe there would be potential error due to longer observation times, but I’m basically looking for the real-world point of diminishing returns. For example, figures 8/9/10 in the following study on OPUS accuracy:


I didn’t know whether there’s a strong case for averaging for hours with NTRIP/VRS, or whether seconds to a few minutes gets you to the typical point of diminishing returns as far as accuracy goes.


Roughly every 25-30 minutes, multipath conditions (along with satellite geometry) will have changed enough that it will affect your position.
With that in mind, in the grand scheme of things, and with decent receiver conditions, it doesn’t really matter whether you measure for 5 seconds or 2 minutes. What really matters is coming back to a given point after a minimum of 25-30 minutes, preferably more.
Do that 3 or 5 times, and you have an average that you can consider single-digit-centimeter accurate, with the potential for being millimeter precise (not necessarily accurate, but precise in relation to other points measured at the same time).
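If it helps, the "come back 3-5 times and average" routine is just a mean of the separate fixes, and the spread between visits gives you a built-in precision estimate for free. A quick sketch, using made-up easting/northing/elevation values in meters rather than real data:

```python
import numpy as np

# Hypothetical fixes on the same point, each taken on a separate
# visit spaced 25-30+ minutes apart so multipath has decorrelated.
visits = np.array([
    [482100.012, 5432000.118, 101.532],   # E, N, U in meters
    [482100.019, 5432000.109, 101.548],
    [482100.008, 5432000.121, 101.525],
    [482100.015, 5432000.114, 101.541],
])

mean = visits.mean(axis=0)            # the averaged coordinate to store
spread = visits.std(axis=0, ddof=1)   # sample std dev between visits
sem = spread / np.sqrt(len(visits))   # std error of the averaged point

print("averaged point:", np.round(mean, 4))
print("between-visit spread (m):", np.round(spread, 4))
print("std error of mean (m):", np.round(sem, 4))
```

Note the between-visit spread only tells you about precision; any bias common to every visit (datum, antenna height, calibration) won’t show up in it, which is exactly the precise-vs-accurate distinction.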

Of course, this can also be achieved by just logging for 3-5 hours over a decent baseline distance, but that requires you to stay set up on the point continuously for the duration.