I am about to take delivery of a Livox Mid-40 LiDAR unit to experiment with UAV LiDAR mapping. The LiDAR unit requires a time-sync input from a GPS (this will be used for PPK position processing). This may be an obvious question (for which I apologise in advance), but is there a simple way to output the time signal from the Reach?
I believe that Reach products will be ideally suited for this task. PPS is what you are asking for (Pulse Per Second). Let me refer you to this page in the Reach RS+ docs:
See the electrical specs section (download the PDF file); note that there is a PPS (pulse-per-second) output on pin 5.
If you were thinking about using the Reach M+ instead, then go to this page in the docs:
See the connectors pinout section; note that there are dual PPS outputs at C1 and C2.
Now, on page 9 of your Livox user manual, notice that the input type is RS485. Reach RS+ and M+ output the PPS signal as TTL-level UART, so you will need a signal converter to interface them. Search using the two keywords “RS485 UART” and you will find a plethora of signal converters available, some as cheap as $10.
On page 16 of your Livox user manual is a description of the PPS synchronization logic. According to that description, I can confirm that Reach outputs a PPS signal with timing that is compatible with the Livox unit.
Note that a PPS signal only delivers synchronization, not time. A clock ticks (or increments) every second, and you can also read the time from it. A metronome ticks (or waves) every second, but it can’t tell you the current time. PPS is like the metronome.
Your data must be recorded with a timestamp, so there must also be a clock that needs to be set. Anyway, if you find that you need a data stream with the current time, then Reach could do that too.
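To illustrate how the two pieces fit together, here is a minimal sketch of the usual pattern: the PPS edge marks the exact top of a second, and a GNSS message (an NMEA RMC sentence in this example) names which second it was. The function names and the pairing logic are my own illustration, not Reach or Livox API code.

```python
# Hypothetical sketch: a PPS edge gives a precise "top of second" mark,
# and an NMEA $GxRMC sentence tells you which second that was. Combining
# the two yields an absolute timestamp for sensor data.
from datetime import datetime, timezone, timedelta

def parse_rmc_time(rmc_sentence):
    """Extract the UTC date and time-of-day from a $GxRMC sentence."""
    fields = rmc_sentence.split(",")
    hhmmss, ddmmyy = fields[1], fields[9]   # time is field 1, date is field 9
    return datetime(
        2000 + int(ddmmyy[4:6]), int(ddmmyy[2:4]), int(ddmmyy[0:2]),
        int(hhmmss[0:2]), int(hhmmss[2:4]), int(hhmmss[4:6]),
        tzinfo=timezone.utc,
    )

def absolute_timestamp(fraction_since_pps, rmc_sentence):
    """PPS marks the top of the second; the RMC sentence names that second.
    fraction_since_pps is the sub-second offset measured from the PPS edge."""
    top_of_second = parse_rmc_time(rmc_sentence)
    return top_of_second + timedelta(seconds=fraction_since_pps)
```

In practice the LiDAR does this pairing internally; the sketch is just to show why both the pulse and a time message (or a manually set clock) are needed.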
I think your big concern will be translating the raw data from the LiDAR reference frame into ground-referenced .las data. This requires very precise IMUs ($$$) and a decent amount of maths.
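The core of that translation can be sketched in a few lines. This is a simplified illustration of direct georeferencing, assuming a yaw-pitch-roll attitude convention and ignoring details a real pipeline must handle (Earth curvature, boresight calibration, time alignment); all names are illustrative.

```python
# Minimal sketch of direct georeferencing a single LiDAR return:
# rotate the sensor-frame point by the platform attitude, then
# translate by the GNSS-derived position (plus a fixed lever arm).
import numpy as np

def rotation_matrix(roll, pitch, yaw):
    """Z-Y-X (yaw-pitch-roll) rotation from body frame to local-level frame."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def georeference(point_sensor, attitude_rpy, position_gnss, lever_arm):
    """Map a LiDAR return from the sensor frame into ground coordinates."""
    R = rotation_matrix(*attitude_rpy)
    return np.asarray(position_gnss) + R @ (np.asarray(point_sensor) + np.asarray(lever_arm))
```

The "precise IMUs" come in because any attitude error in `rotation_matrix` is multiplied by the range: at 100 m range, 0.1° of attitude error is already ~17 cm of ground error.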
I am not familiar with the post processing of LiDAR data, but can’t the LiDAR position be georeferenced by the GNSS receiver, and then the orientation be calculated by matching multiple point clouds of the same subject area? Much the same as everyone is doing with their camera images?
This is a lot easier when doing ground surveys on a tripod (terrestrial laser scanning). There, the location is fixed and all points in the cloud share the same origin. You can put out some reflectors and tie the clouds from different survey locations together.

On a UAV, each origin is different (determined by the GNSS), and the number of raw points at any one location is far lower than the 100k points per second the unit records. So you’d have to match up the handful of LiDAR points (dependent on your cruising speed) that you interpolate between GNSS measurements, and these won’t all overlap.

In SfM you take images that overlap and exploit the geometry. Think of each picture as the “point cloud”: all points in that hypothetical cloud share the same origin (ignoring rolling versus global shutter). That is quite different from what I describe above, and it is this property that makes it possible to exploit the geometry from the images.
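The interpolation step mentioned above can be sketched very simply. This assumes a linearly interpolated trajectory between PPK fixes, with made-up illustrative values; a real pipeline would interpolate a fused GNSS/IMU trajectory instead of raw fixes.

```python
# Sketch: assigning a position to each LiDAR return by interpolating
# between PPK GNSS fixes (all timestamps and coordinates are invented).
import numpy as np

gnss_t = np.array([0.0, 1.0, 2.0])             # PPK fix times (s)
gnss_xyz = np.array([[0.0,  0.0, 100.0],
                     [10.0, 0.0, 100.0],
                     [20.0, 0.0, 100.0]])      # fix positions (m)

point_t = np.array([0.25, 0.5, 1.5])           # LiDAR return times (s)

# Linear interpolation per axis, keyed on the synchronized timestamps.
point_xyz = np.stack(
    [np.interp(point_t, gnss_t, gnss_xyz[:, i]) for i in range(3)],
    axis=1,
)
```

At typical GNSS rates (5–10 Hz) versus 100k LiDAR points per second, thousands of returns share each pair of fixes, which is why attitude from the IMU, not just position, ends up doing most of the work.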
MEMS IMUs are getting cheaper though!
Thanks everyone for your input. My Livox is en route from China, so I will have a better idea of the physical setup once it arrives. I am aware of the challenges I face, but that is the fun bit. I will need the data time-stamped in order to correct for the moving sensor (using GNSS and IMU data). As noted, this is the complex bit, but there are open-source tools (and commercial ones at significant cost). In discussion with Livoxtech it appears they have a tie-up with DJI, but I have asked them to provide support for a wider community. They suggest they are working on a solution to use the on-board flight controller IMU; whether this is sufficiently sensitive, only time will tell. For the learning process that is also my plan (IMU data from the Pixhawk and GNSS PPK data from the Reach, or preferably a new Reach based upon the u-blox ZED-F9P). There are high-quality INS systems that could be needed, but these are getting relatively cheap now (n.b. I work in a university where our latest UAV-based LiDAR cost in excess of US$90,000).
I’ll keep you posted.