RS2 accuracy verification

Hello, I use two RS2 units in a base/rover setup to map evidence related to car crashes. I am really only interested in relative accuracy; absolute positioning on the earth is not important for my work. I work in RTK mode and have little understanding of, or interest in, post-processing. Since much of my work is related to litigation, I wanted a way to verify the accuracy of my setup/workflow.

So, a while back, I took my units to a local survey calibration base line (Arsenal CBL near Denver, Colorado) and collected position data on the zero, 150 m, and 450 m stations. The precise distances between the stations have reportedly been verified as part of the National Geodetic Survey to within 0.2 mm. I collected 8 or 9 data points at each station and walked in a circle away from and back toward the station between each point, trying to mimic how I collect data at a crash site. I also walked back and forth between the zero and 150 m stations but drove to the 450 m station with the rover sticking out the sunroof. I collected data with ReachView 3 using the (default?) WGS 84, EPSG:4326 coordinate system (10-second collection time) and then used the NCAT website to convert the points to UTM coordinates. I averaged the readings for each station and calculated the station-to-station distances.

The results, shown below, were somewhat surprising, with 10.5 cm of error at 150 m and 32 cm at 450 m. Even more surprising was that the measured distance between stations was consistently 0.07% shorter than the actual distance, indicating some sort of systematic error.
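
The averaging and station-to-station distance step described above can be sketched as follows (a minimal Python illustration with made-up UTM readings, not the actual data from the post):

```python
import math

def mean_point(points):
    """Average repeated (easting, northing) readings for one station."""
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

def station_distance(a, b):
    """Planar (grid) distance between two averaged station positions."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

# Hypothetical UTM readings in metres, for illustration only
zero_sta = [(500000.000, 4400000.000), (500000.004, 4399999.996)]
sta_150 = [(500149.902, 4400000.003), (500149.898, 4399999.999)]

d = station_distance(mean_point(zero_sta), mean_point(sta_150))
```

Note that the distance computed here is a grid distance in the UTM projection, which is the crux of the discrepancy discussed below in the thread.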

Arsenal CBL UTM results

I tried converting the data to state plane coordinates and got results that were consistently 0.03% shorter than actual (see below). Interestingly, when I converted the data to XYZ (ECEF) coordinates, the error dropped by an order of magnitude (0.0025%), but was still always on the short side. Obviously, XYZ coordinates are not very useful for mapping skid marks on a roadway.

Arsenal CBL SPC-XYZ results

As a follow-up test, yesterday I marked out a 99-foot base line using a tape measure on a fairly level roadway outside my office. I collected data with ReachView 3 using this coordinate system: NAD83/Colorado Central (ftUS) + NAVD88 height (ftUS), EPSG:8721 + EPSG:6360. I collected 10 points at each end of the base line, walking back and forth between each point (10-second collection time). On average, the distance measured was again 0.04% shorter than actual.

99-foot test results

Now, for my work, being off by ½ inch over 100 feet is not a huge problem. However, the systematic nature of the error (always short) and the potential to be off by one foot in 1,500 feet (32 cm on a 450m base line) raises some questions for me. I am not a professional surveyor so, if you see something wrong with my method, please let me know and I will be happy to try something different. Otherwise, I think the RS2 should be able to do better—am I wrong? Or is this just a coordinate system and/or coordinate conversion problem? Also, how do you verify the accuracy of your GNSS equipment on an ongoing basis? As far as I am aware, there is no “calibration” procedure, is there? A GNSS device either works or it doesn’t, right?

Thanks in advance for any insight you can offer. I would be happy to share the raw data if anyone is interested in taking a look. I really do want to learn and to better understand any issue with my methods or equipment.

By the way, precision or repeatability does not appear to be an issue. For the 20 points collected at the Arsenal CBL, the average deviation from the calculated mean station position (in 3D space) was only 9.5 mm with a maximum deviation of 16 mm. Seems pretty good to me.
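
The repeatability check described here (deviation of each point from the station's mean position in 3D) can be computed with a short sketch like this (hypothetical readings, not the actual data):

```python
import math

def deviations_from_mean(points):
    """3-D distance of each repeated reading from the station's mean position."""
    n = len(points)
    mean = tuple(sum(p[i] for p in points) / n for i in range(3))
    return [math.dist(p, mean) for p in points]

# Hypothetical repeated readings (metres) for one station
pts = [(-1288000.010, -4720000.005, 4080000.002),
       (-1288000.002, -4719999.995, 4079999.994),
       (-1287999.994, -4720000.000, 4080000.004)]

devs = deviations_from_mean(pts)
avg_dev = sum(devs) / len(devs)
max_dev = max(devs)
```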


Guy B.


I would suggest reading the NGS page

These baselines were established by terrestrial methodology (calibrated tape and EDMI) for EDMI and calibrated-tape verification. You can use the baselines for GNSS receivers; however, for reduction purposes it's best to use the determined ellipsoid heights and reduce "up" or "down" to the published distances.

NOAA_TM_NOS_NGS_0010(1).pdf (542.6 KB)

Keep in mind you must follow the above instructions regarding care in leveling and plumbing tribrachs, optical plummet accuracy, temperature and barometric conditions, and proper reduction of data if you're ever to approach the accuracies of the published baseline values.

I would not use typical bipod survey rods for verifying GNSS equipment on calibration lines. Proper equipment, such as survey tripods with accurate tribrachs/optical plummets or fixed-height adjustable GNSS tripods, is needed.


Hi Guy,
It sounds like the issue is likely that the distances provided are "ground distances" (definitely the case with your tape measure) and that the GNSS distances are grid distances.
Have a read about "grid to ground".
Look up the conversion factor (e.g., 0.999965 for your area) and I think you'll find this will improve your results.
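
Applying such a factor is a one-line correction. A minimal sketch, assuming the example factor of 0.999965 quoted above (the true factor depends on your location and height):

```python
def grid_to_ground(grid_distance, combined_scale_factor):
    """Recover a ground distance from a grid distance.

    Since grid = combined_scale_factor * ground, we invert the factor.
    """
    return grid_distance / combined_scale_factor

# Illustrative: a GNSS-derived grid distance, corrected back to ground
k = 0.999965          # example factor from the post, not site-verified
ground = grid_to_ground(149.995, k)
```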

The GNSS errors will be relatively consistent regardless of the distance (i.e., around 20 mm), so GNSS is relatively less precise (compared to a total station) when measuring two points 1 m apart than when measuring two points 500 m apart, e.g., 20 mm/1 m vs. 20 mm/500 m.
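
In rough numbers, the same fixed error translates into very different relative precision. A quick illustration (the 20 mm figure is the example above, not a specification):

```python
def relative_error_ppm(error_m, distance_m):
    """Express a fixed positional error relative to the measured distance."""
    return error_m / distance_m * 1e6

short_baseline = relative_error_ppm(0.020, 1.0)    # 20 mm over 1 m
long_baseline = relative_error_ppm(0.020, 500.0)   # 20 mm over 500 m
```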

There is no calibration per se.
However I can find a document for you that outlines best practice for collecting data to test and minimize error. It’s by the ISPRS, special publication 2.


Hi James, I think this is the paper you are referring to.
GPS and Grid to Ground

Also Mark Silver over at iGage has a good video on the topic.
Grid vs. Ground Distance Measurements
Why does my GPS measure distances short?

Regards, Mark


“Or is this just a coordinate system and/or coordinate conversion problem?”

Yes. (Almost for sure)

It's hard to say without understanding exactly what you did, but since State Plane shows less error than UTM, there is some transformation error in what you did.

As others have pointed out, look at grid versus ground distances.

If you went back and entered a “ball park” accurate (autonomous is fine) lat/long position for your base and then compared that position to what your rover got at the 150 and 450 meter points using, say, NGS’s INVERSE program to compute the length, you would likely be good to a cm or two. You can expect baseline lengths of 150 or 450 meters to have essentially the same error (1-2 cm).
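
As a rough stand-in for NGS's INVERSE (which computes the inverse on the ellipsoid), a spherical great-circle inverse illustrates the idea; it is only an approximation and can disagree with the ellipsoidal value by up to a few tenths of a percent of the distance:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2, radius=6371008.8):
    """Great-circle distance in metres on a sphere (mean Earth radius).

    A rough stand-in for the ellipsoidal inverse computed by NGS INVERSE.
    """
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * radius * math.asin(math.sqrt(a))
```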

Hi guys,

Thanks for the valuable discussion! The difference between grid and ground distances indeed looks like the reason for this error.

@surveys4all, did you get a chance to check the grid to ground conversion factor? Has it helped?


The UTM scale issue seems to be part of it, but not all of it, I think.
UTM scale error (up to ~400 ppm) amounts to about 4 cm for every 100 m, which does not account for all 10.5 cm of yours (at 150 m).
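
The ppm arithmetic is simple enough to script (a sketch; the 400 ppm and 11 ppm figures are the ones quoted in this thread):

```python
def scale_error_m(ppm, distance_m):
    """Distance distortion implied by a projection scale error given in ppm."""
    return ppm * 1e-6 * distance_m

utm_at_150 = scale_error_m(400, 150)   # well short of the 0.105 m observed
ntm_at_100 = scale_error_m(11, 100)    # roughly a millimetre
```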

We have close to zero scale error (11 ppm, or 1 mm for every 100 m) with EUREF89 NTM (Norwegian Transverse Mercator) for large constructions and sites.
Do you have something similar?

- 1° zone width
- The central meridian falls on the half degree: 5°30′, 6°30′, etc.
- Scale 1.0000 at the central meridian (tangent)
- False northing of 1,000,000 (N = 1,000,000 at B = 58° North) and false easting of 100,000
- Same geoid and ellipsoid model as EUREF89 UTM
- Maximum scale correction of 11 ppm within the natural zone
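
The parameters listed above correspond to a transverse Mercator definition. As a sketch, one NTM zone (central meridian 5°30′) might be written as a PROJ string like this; the GRS80 ellipsoid is assumed from EUREF89, so verify against the official EPSG definitions before use:

```text
+proj=tmerc +lat_0=58 +lon_0=5.5 +k=1 +x_0=100000 +y_0=1000000 +ellps=GRS80 +units=m
```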

Thanks for all of the replies. Not being a professional surveyor, I had no idea about grid vs. ground. The calibration base line was set up using a total station, so the "actual" distances are ground distances. That Mark Silver video describes my problem exactly, I think. I also found this paper, which I think will be helpful:

Working with grid coordinates

I'll do some research to understand the local scale factor, but preliminarily it looks to be about 0.999747803, which reduces the errors by 75%. So I'm feeling better. I will likely have additional questions as I work on this project and will post on this topic again when I do.

Thanks again. I really appreciate the help.

Guy B.


Yep, I think it's a unit issue. Geographic systems are normally used for navigation and are in degrees. That's why I always use Cartesian spatial reference systems like UTM for land-based measurements, since they are in meters.

Hi Guy,

> I'll do some research to understand the local scale factor, but preliminarily it looks to be about 0.999747803, which reduces the errors by 75%.

That’s good! Can’t wait to see how your project goes :blush:


It's more a projection problem than a unit problem. It would be sooo much easier if the Earth was flat :stuck_out_tongue:


This is the document that I was thinking of regarding GNSS verification

SG_Direction_09.pdf (2.5 MB)


Thank you James for the clarification. I found the paper a good read on best practices with GNSS.
Regards, Mark


Doesn’t the NGS have a specific tool for reconciling surface to grid measurements?

It's shown as the combined scale factor in the conversion report. Grid distance = combined scale factor × measured ground distance.

Grid distance = 0.98987255 x 1,050.67’
Grid distance = 1,040.03’

The grid distances are used to determine true grid coordinates. In the example above, the grid surface is below the ground (orthometric) surface.

The combined scale factor is also shown on all NGS datasheets for control stations, provided vertical coordinates are available.

You set the combined scale factor in the survey controller, and then all measured ground distances are converted to grid distances. This is useful if you need grid coordinates for a point.
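
The datasheet example above, in code form (a sketch; the factor and distance are the figures quoted in this post):

```python
def ground_to_grid(ground_dist, combined_scale_factor):
    """Convert a measured ground distance to a grid distance,
    as done with the combined scale factor from an NGS datasheet."""
    return combined_scale_factor * ground_dist

# Worked example from the post (US survey feet)
grid = ground_to_grid(1050.67, 0.98987255)  # ~1040.03 ft
```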

The NGS/NOAA manual NOS NGS 5 has a lot of great information on understanding projection systems and includes all the formulae if you want to program any of the processes. It also has a lot of great graphics for understanding projection planes and surfaces.


Since it's about the UTM scale error, I think it will fit right in here.
Hope it will shed some light on the issue.

