Point Averaging Times

Good Day,

I’m using an RS2+ and am looking to see what others are typically doing in the field for shooting and setting points.

I’m fairly familiar with the OPUS workflow and its time-vs-accuracy tradeoffs (4-5 hours puts you in a good spot, sitting the RS2+ for a full 8+ hour burn can eke out better accuracy, the good/bad/ugly of Rapid Static, etc.).

I’m usually using a single RS2+ to shoot GCPs for drone aerial surveys, as well as to establish accurate base points for future use and RTK drone flights. My main question is about averaging times when using NTRIP for GCPs and base points. I have a decent understanding of error types, but I’m lacking actual field practice in this regard. Assuming I have a reasonably close CORS station or am using a VRS, is there a good ‘rule of thumb’ for fix-averaging times when shooting points with NTRIP? What are the general expectations for accuracy gained or lost when averaging for x time versus just grabbing an instantaneous coordinate?

I always shoot a 2-minute averaged fix and loop back to stake them out on my way back to the truck.
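For what it’s worth, the pass/fail math on that loop-back check is simple enough to script. A minimal sketch, assuming local easting/northing in meters and an illustrative 2 cm tolerance (the function and numbers are hypothetical, not a feature of any particular field software):

```python
import math

def check_shot(staked, check, tol_m=0.02):
    """Horizontal distance between a stored point and a later check shot.

    staked, check: (easting, northing) in meters on a local grid.
    tol_m: illustrative 2 cm tolerance -- set your own per project.
    """
    dist = math.hypot(check[0] - staked[0], check[1] - staked[1])
    return dist, dist <= tol_m

dist, ok = check_shot((100.000, 200.000), (100.012, 199.995))
print(f"check shot off by {dist * 100:.1f} cm -> {'OK' if ok else 'reobserve'}")
```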


What you are really measuring is time; the position is calculated from it. With this in mind, theoretically: the more time you sit on station > the more satellite observations you collect > the more equations make it into the matrix > the better the PDOP and accuracy. I usually work in RTK and occupy the point for at least 1 minute, increasing the time for station points, base points, or bad PDOP.
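To put a number on that chain of reasoning, here is a minimal sketch of the textbook DOP calculation: each tracked satellite contributes one row to the design matrix, and PDOP falls out of the inverted normal matrix. The azimuth/elevation values are made up for illustration; this is not any receiver’s actual solver.

```python
import numpy as np

def pdop(sats_az_el_deg):
    """PDOP from satellite (azimuth, elevation) pairs in degrees (ENU frame).

    Each satellite adds one row to the design matrix A: the negative
    unit line-of-sight vector plus a receiver-clock column. More
    well-spread rows shrink (A^T A)^-1, i.e. lower the DOP.
    """
    rows = []
    for az_deg, el_deg in sats_az_el_deg:
        az, el = np.radians(az_deg), np.radians(el_deg)
        rows.append([-np.cos(el) * np.sin(az),  # east
                     -np.cos(el) * np.cos(az),  # north
                     -np.sin(el),               # up
                     1.0])                      # clock bias
    A = np.array(rows)
    Q = np.linalg.inv(A.T @ A)  # cofactor matrix of the solution
    return float(np.sqrt(Q[0, 0] + Q[1, 1] + Q[2, 2]))

# Five satellites bunched in one part of the sky vs. eight spread around it
clustered = [(30, 40), (45, 50), (60, 45), (50, 70), (40, 60)]
spread = [(0, 30), (45, 60), (90, 25), (135, 55),
          (180, 35), (225, 65), (270, 30), (315, 50)]
print(f"PDOP, 5 clustered sats: {pdop(clustered):.2f}")
print(f"PDOP, 8 spread sats:    {pdop(spread):.2f}")
```

The spread constellation gives a much lower PDOP than the clustered one, which is the same reason longer occupations help: as satellites move, the geometry fills in.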
I only know of reasons to worry about short logs; what would be the reason for more time to be a loss in quality?


Thanks; what would you say is the top end of your observation times for station/base points, and why?

I don’t believe longer observation times would introduce error; I’m basically looking for the real-world point of diminishing returns. For example, see figures 8/9/10 in the following study on OPUS-RS accuracy:

https://www.researchgate.net/publication/320068882_Accuracy_of_Rapid_Static_Online_Positioning_User_Service_OPUS-RS_Revisited

I didn’t know if there’s a strong case for averaging for hours with NTRIP/VRS, or if seconds to a few minutes gets you to the typical point of diminishing returns with regard to accuracy.


Roughly every 25-30 minutes, multipath conditions (along with satellite geometry) will have changed enough to affect your position.
With that in mind, in the grand scheme of things, and with decent receiver conditions, it doesn’t really matter whether you measure for 5 seconds or 2 minutes. What really matters is coming back to a given point after a minimum of 25-30 minutes, preferably more.
Do that 3 or 5 times, and you have an average you can consider single-digit-centimeter accurate, with the potential to be millimeter precise (not necessarily accurate, but precise relative to other points measured in the same session).

Of course, this can also be achieved by just logging for 3-5 hours over a decent baseline distance, but that requires you to be set up on the point continuously for the duration.
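Here is a minimal sketch of the repeat-occupation math described above, with made-up coordinates for three visits to the same point:

```python
import numpy as np

# Hypothetical: the same point occupied three times, each visit
# separated by 25-30+ minutes (easting, northing, height in meters).
occupations = np.array([
    [552310.012, 4182455.484, 102.118],
    [552310.027, 4182455.471, 102.141],
    [552310.018, 4182455.490, 102.125],
])

mean = occupations.mean(axis=0)           # the averaged coordinate
spread = occupations.std(axis=0, ddof=1)  # between-visit scatter

print("averaged E/N/H:", np.round(mean, 3))
print("std dev E/N/H (m):", np.round(spread, 3))
# A small spread shows the visits agree (precision); accuracy still
# depends on the base/CORS coordinates being right.
```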


Hi @chasee,

Averaging the base gives you more data for computing your survey positions. We can’t give one specific time, because it depends on the environmental conditions: how clear the sky view is and whether there are obstructions.

As other users in the thread have commented, the averaging time can vary with environmental conditions and workflow. Our suggestion is to collect each point with at least 30 seconds of observation; this helps the receiver calculate its position from more data.
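At the default 1 Hz logging rate, 30 seconds comes out to roughly 30 fixed-solution epochs. As a rough sketch of what the averaging amounts to (hypothetical helper, not the receiver’s actual implementation; a real workflow should also discard epochs that lose FIX):

```python
import statistics

def average_fix(epochs):
    """Average (lat, lon, height) epochs from a single occupation.

    epochs: list of (lat_deg, lon_deg, height_m) tuples, e.g. ~30
    entries for a 30 s occupation at 1 Hz. Sketch only -- a real
    implementation should filter on solution status first.
    """
    lat = statistics.mean(e[0] for e in epochs)
    lon = statistics.mean(e[1] for e in epochs)
    hgt = statistics.mean(e[2] for e in epochs)
    return lat, lon, hgt
```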

If you need absolute accuracy, you can obtain a precise position for the base by uploading your RINEX observation file to OPUS (if you are in the US) or by using a PPP service such as NRCan’s CSRS-PPP.


My attitude is to collect enough data for post-processing. It’s great that you can log data at 10 Hz or faster (especially with high-$$ receivers like our Javads); however, you can’t post-process long baselines without enough data. Even with local base receivers on site, the post-processing software will need enough data.

In typical boundary survey work, I’m not one to log for any length of time on items like edges of roadways, driveways, or other non-essential points, but important items like boundary corners, tie distances, and even GCPs require substantial data to ensure the accuracy of the project and to allow verification by post-processing.

There are no set time limits for observing surveyed points; it’s basically trial and error. Experience and time on station will go a long way toward guaranteeing the accuracy required for a project. Just be sure you have enough data in case someone questions your accuracy (and be sure to question theirs, too).

My thinking is always “prove me wrong”. If you’re questioning your own data, then you haven’t observed the point long enough.
