Are there any advantages to sending corrections faster than 1 Hz from the base to the rovers?
If you are in a vehicle doing auto-topo.
Yup, like Chasco says, if you wanna go fast, really really fast.
But then the required data rate goes up, and you will hit the bandwidth limits of LoRa, or make cellular corrections more expensive.
I’m not sure I really understand; base corrections really only compensate for changes in the signal delays through the earth’s atmosphere. Are there really any tangible advantages beyond 1 s corrections?
The Hz value is an update and output rate; the speed of the radio is another factor which needs to be taken into account. We learned on Topcon gear, which is our primary equipment and supports rates up to 20 Hz at 115k bps. Emlid tops out at 10 Hz and 18k bps, so at its max settings it is still half the update rate and receives roughly a sixth of the data (18k vs. 115k bps). With the Emlid, in our experience, 1 Hz is not a fast enough RTK correction rate for a 30 mph / 0.5 sec auto-topo. When the surface generated from a roadway topo was ground-truthed with a rover, we were not getting the 0.10 ft tolerance that we needed to calculate vertical curves and tie-ins.
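To make the radio-speed point concrete, here is a rough back-of-envelope check in plain Python. The bytes-per-epoch figure is an assumption (a full multi-GNSS correction set is very roughly on the order of a kilobyte per epoch; the real size depends on message types, constellations, and satellites tracked), so treat this as a sketch, not a spec.

```python
# Rough check: does a correction stream at a given update rate fit a radio link?
# bytes_per_epoch is an ASSUMED ballpark, not a measured RTCM3 figure.

def fits_link(update_hz, link_bps, bytes_per_epoch=1200):
    """Return (required_bps, fits) for a correction stream at update_hz."""
    required_bps = update_hz * bytes_per_epoch * 8  # 8 bits per byte
    return required_bps, required_bps <= link_bps

# An 18 kbps LoRa-class link vs. a 115.2 kbps UHF radio
for hz in (1, 5, 10):
    for name, bps in (("18k link", 18_000), ("115k link", 115_200)):
        req, ok = fits_link(hz, bps)
        print(f"{hz} Hz over {name}: needs {req} bps -> {'OK' if ok else 'too slow'}")
```

Under that assumption, 1 Hz fits the slow link comfortably, 5 Hz already overruns it, and even 10 Hz still fits the faster radio, which is consistent with the bandwidth limits mentioned above.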
A rate of 1 Hz also wasn’t fast enough to supply sufficient corrections for GPS-controlled machines like motorgraders and bulldozers. The smoothness of blade operation was much better with a 5 Hz base and a 10 Hz machine. Considering the IMUs calculate at 1,000 measurements/sec, the machines are capable of operating much faster than any rate we could get without overloading the GNSS hardware. That said, we have never run more than 10 Hz on a machine because the hardware up until now has not been able to handle it. This seems to be more the case on motorgraders than dozers, which in normal scope go much slower. We haven’t even talked about scrapers, which are much faster than either of those, but they have a 4-6 in tolerance at final grade.
I’m confused, because I consistently achieve better than 2 cm horizontal and 4 cm vertical accuracy using a 1 Hz base station on vehicles moving at 40 mph. I have PPK’d datasets at both 1 Hz and 10 Hz, and I see no difference. More than likely, with ground vehicles moving faster than about 5 mph, vehicle dynamics like suspension and control inputs become the primary factor in the error budget, not the GNSS antenna position itself.
Just so we’re clear here, this isn’t about the receiver update rate (number of observations per second that are corrected)… it’s about the update rate of the base station correction data going to the vehicle. We agree that 1Hz receiver update rate would be far too slow for many applications.
On a final note, we use 1Hz base station data to correct airborne systems (flying at 150mph), and we consistently achieve 5cm accuracy. We can use base station data down to 0.2Hz, and even then, we don’t see much of a drop in accuracy.
How are you checking your data? I know how we did it, and it was quite obvious that a 5 Hz/10 Hz configuration was more accurate than a 1 Hz/5 Hz one mounted on a vehicle travelling at 30 mph. These tests were performed on a paved roadway over a 10-mile stretch in left, center, and right positions; that’s over 5,000 observations. It is very hard to believe that you are holding a 2 cm tolerance in a vehicle travelling at 40 mph. What were your topo intervals?
If you use one correction indefinitely, you will drift all over.
The quicker the correction refresh, the closer your position will be while moving.
If you are moving at 3 m/s, the correction you get and the measurement the rover makes are already well behind you. Increasing the refresh rate of corrections and positions gives the measured position more resolution, at the cost of data.
For many applications this does not matter regardless of how fast or slow you are going. Flying a plane in open sky or sailing a boat across the sea you really do not need cm accuracy, or even meter accuracy.
Controlling a dozer blade or flying a drone in a tight area with collision hazards would require the faster refresh.
Glad the functionality for higher refresh is on base and rover, but 95% of the time the higher refresh is not critical to any application.
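The stale-correction effect above is easy to put in numbers. This is pure kinematics (distance travelled during one correction interval); the actual position error also depends on how the receiver ages and extrapolates corrections, which this sketch ignores.

```python
# Distance the vehicle moves between correction epochs at various speeds
# and update rates. Kinematics only; receiver behavior is not modeled.

MPH_TO_MS = 0.44704  # miles per hour -> meters per second

def stale_distance_m(speed_m_s, update_hz):
    """Distance traveled during one correction interval, in meters."""
    return speed_m_s / update_hz

for mph in (5, 30, 40):
    speed = mph * MPH_TO_MS
    for hz in (1, 5, 10):
        d = stale_distance_m(speed, hz)
        print(f"{mph} mph at {hz} Hz: {d:.2f} m between epochs")
```

At 30 mph and 1 Hz, that is over 13 m of travel between corrections; at 10 Hz it drops to about 1.3 m, which is the resolution-vs-data trade-off described above.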
Chasco beat me to it lol
This isn’t ground observations; it’s photo-center observations validated through aerotriangulation… about 1,200 highly accurate observations per flight. I think there’s a fundamental disconnect here… we also use 10-20 Hz collection on the rover, but the base station update rate required to achieve a fix on the airborne data is only 1 Hz. That fix can be maintained for 30 s or more AFTER loss of base station correction data, at which point it degrades to float and then 3D fix.
Of course, but at what point are there diminishing returns? If I understand correctly, the industry standard is 1Hz because ionospheric delays aren’t changing significantly faster than that. Not every rover observation requires a base station observation because the ambiguities aren’t changing. It’s not like the rover can only correct the observations that the base station transmits… instead the base station is used as a reference observation to correct the current and any future observations if possible.
What we’re doing here with our RTK/PPK data is aerial surveying - making maps from photos that need a very accurate position in order to create accurate ground survey data. This requires very precise timing, because every ms at 40 mph is about 2 cm. We have to interpolate between 10 Hz data points (only one point roughly every 2 m), yet we can still achieve about 2 cm accuracy as long as the vehicle is relatively stable on that flight path.
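Those two figures check out with simple arithmetic (nothing receiver-specific here, just unit conversion):

```python
# Sanity check: at 40 mph, how much along-track distance does 1 ms of timing
# error represent, and how far apart are 10 Hz position fixes?

MPH_TO_MS = 0.44704
speed = 40 * MPH_TO_MS                 # ~17.88 m/s

err_per_ms_cm = speed * 0.001 * 100    # cm of travel per millisecond
spacing_m = speed / 10                 # meters between 10 Hz epochs

print(f"{err_per_ms_cm:.2f} cm per ms of timing error")
print(f"{spacing_m:.2f} m between 10 Hz fixes")
```

Both come out to about 1.8, matching the "about 2 cm per ms" and "one point every ~2 m" numbers above.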
So you’re talking about RMSE on photo-location correction? That’s not even close to what we are talking about in this thread, so there is definitely a disconnect. Aerial mapping is a good bit of what I do, including beta testing, and through working with several photogrammetry engineers it is well known that the algorithms that produce those figures take much more into account than just the location reported in the EXIF data. Part of it is a visual best fit, so while you are getting good aerial data, that means nothing for stakeout and ground-truthing, which is what is required to provide certifiable confidence in the data. Do you use GCPs and checkpoints? If not, then your figures are theoretical at best when it comes to the accuracy of the reconstruction compared to reality.
Please don’t mistake me; this is ALL done with surveyed control and check points - 20+ check points for statistical significance. We have absolute confidence in our data, especially over our calibration site. My point is that photogrammetry is an excellent way to reverse-engineer a trajectory using bundles of redundant geometry. Those 1,200 highly accurate observations are calculated by the photogrammetry software, and the resultant error between the triangulated frame-center estimate and the GNSS tags of the frame center has been on the order of 2 cm RMSE X/Y and 4 cm RMSE Z.
If using a faster base station log to correct the photo positions yielded lower overall errors, then yes, I would agree that RTK at more than 1 Hz would yield a solution with higher accuracy. Unfortunately my experimental results do NOT indicate that: faster base station corrections do NOT improve accuracy, even for vehicles at high speeds, and thus there is NO advantage. That’s my answer to the question above.
Just for clarification, I didn’t mean to imply that your results weren’t good for your application, but that it is just different from the topic, or at least the majority of the discussion. It is a practical use-case, but I assume you aren’t using two RS2’s as the base/rover pair… What is your aerial collection hardware?
RS2 as the base and an M2 as the rover; that’s practically the same as two RS2’s, since it’s the same chip in both. The only thing I’m doing differently is feeding RTK through the RS2’s Wi-Fi, into my laptop, into my drone’s MAVLink data stream, through the flight controller, and into the M2, instead of using LoRa.
While they are the same chipset, they do act differently in the field. I tried the M2 as a rover for about two weeks and quickly took my other RS2 back. The M2 works fine in the air, but it would not cut it on the road test. Obviously corrections over Wi-Fi are going to behave differently than LoRa. It seems we’ve got two completely different use-cases and two different results, so now we need a third as a control.
If I use a 5 Hz rate for sending corrections, must I also use 5 Hz raw data collection at both base and rover?
In order to send a 5 Hz correction, you would have to be collecting at 5 Hz, 10 Hz, or 20 Hz on the base. There’s no requirement on the rover, but the rover would be discarding data if it is receiving corrections faster than it is collecting.
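One way to read that rule, as a sketch: the base can only emit corrections at a rate no faster than it logs, and (assuming the 1/5/10/20 Hz rates mentioned earlier) the correction rate needs to divide evenly into the logging rate. The function name here is hypothetical, just to illustrate the check:

```python
# Hypothetical validity check for pairing a correction output rate with a
# base logging rate. ASSUMES the correction rate must evenly divide the
# logging rate, per the 5/10/20 Hz examples given above.

def can_send(correction_hz, base_logging_hz):
    return base_logging_hz >= correction_hz and base_logging_hz % correction_hz == 0

print(can_send(5, 10))   # True  - base logging at 10 Hz can feed 5 Hz corrections
print(can_send(5, 20))   # True
print(can_send(5, 1))    # False - base logging at 1 Hz cannot produce 5 Hz output
```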
If you are referring to logging, then yes, it will log at whatever the RTK settings are; there’s no independent setting.