I’m using a rover setup on a Raspberry Pi 4 (4 GB) with Navio2 and the latest Emlid image from last September. I have a Raspberry Pi Camera Module v2. I use wireless connectivity with an uplink of around 100 Mbps.
I use raspivid and GStreamer to stream video to the GCS, as well as MAVProxy. I’ve tried different options for the raspivid command, even dropping to 15 fps, but almost any movement on the screen is blurred and only a still picture looks OK. Example of the command:
raspivid -n -w 1280 -h 720 -b 1000000 -fps 25 -t 0 -o - | gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=10 pt=96 ! udpsink host=10.10.10.10 port=9000
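For reference, a matching receiver-side pipeline on the GCS might look like this (a sketch, not verified on this setup; it assumes GStreamer with the rtp, h264 and autodetect plugins is installed on the GCS, and the port must match the udpsink above):

```shell
# Hedged sketch of a GCS-side receiver for the sender pipeline above.
# The caps string tells udpsrc what RTP payload to expect (H.264, pt=96).
gst-launch-1.0 -v udpsrc port=9000 \
  caps="application/x-rtp, media=video, clock-rate=90000, encoding-name=H264, payload=96" \
  ! rtpjitterbuffer ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink sync=false
```

The rtpjitterbuffer element smooths out packet arrival times; sync=false on the sink trades A/V sync for lower display latency, which is usually what you want for piloting.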
Does anyone have experience with streaming video on the same HW and getting decent quality so one can remotely operate a vehicle out of sight?
Thanks for the explanation!
Usually, video quality issues occur because of connection problems. I’d suggest double-checking the strength of the connection you use for obtaining the video.
It might also be a good idea to test your setup on the ground, close to the GCS, at a slow speed. This will let you see how speed and distance affect the quality of the video.
I’m testing video in the rover setup while everything is standing still, just waving a hand in front of the camera and comparing it with the video on the GCS. Currently the downlink is 500 Mbps and the uplink 100+ Mbps, which should be more than enough… (tested with Speedtest on the rover).
It looks like one of the problems might be related to mavproxy.py and ArduRover, which together sometimes consume up to 30% of CPU on the RPi 4, resulting in loss of UDP packets (video). Another issue might be the need for a VPN, which could also be a source of the problem, or a combination of both.
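If CPU contention is the suspect, one possible mitigation (a sketch, not verified on this setup) is to pin the streaming pipeline to a dedicated core with taskset and adjust its scheduling priority with nice, so that MAVProxy/ArduRover spikes on the other cores are less likely to stall the encoder and drop UDP packets:

```shell
# taskset pins a process to a chosen CPU core; nice adjusts its priority.
# On the RPi 4 you could dedicate core 3 to video, e.g. 'taskset -c 3'.
# 'echo streaming' below is a stand-in for the real raspivid | gst-launch pipeline.
taskset -c 0 nice -n 10 echo streaming
```

A negative nice value (higher priority) would need sudo; whether this actually helps depends on whether the drops come from CPU starvation or from the radio link.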
Thanks for your patience!
The issue is indeed most likely in the capacity of the Raspberry Pi. It might not be enough for the demanding task of video streaming. It’s also important to provide a stable, high transmission speed.
@alekso This happens because this approach, as mentioned in the Navio2 docs, cannot give you good streaming performance… The rpicamsrc element (instead of raspivid) in your GStreamer command (on the drone side) is the ideal way to send video… And of course, using the SRT protocol will make a huge difference. I’ve been searching for how to do this for a long time and am still looking.
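For an idea of what an SRT variant of the sender might look like, here is a sketch, assuming GStreamer 1.16+ with the srt plugin (from gst-plugins-bad) installed; the address, port and latency value are placeholders and this is not verified on the Navio2 setup:

```shell
# Hedged sketch: H.264 over SRT instead of plain RTP/UDP.
# srtsink retransmits lost packets within its latency window (in ms),
# which can smooth out the artifacts seen with a bare udpsink.
raspivid -n -w 1280 -h 720 -b 1000000 -fps 25 -t 0 -o - | \
  gst-launch-1.0 fdsrc ! h264parse ! mpegtsmux ! srtsink uri="srt://10.10.10.10:9000" latency=200
```

The receiver would use a matching srtsrc with the same latency setting.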
When I used the proposed commands from the Navio2 docs, I also had very poor performance with blurred images. It is practically impossible to use this approach for video transmission on a drone.
The best result I have managed to get so far without blurred images was with this command on my drone:
gst-launch-1.0 rpicamsrc bitrate=1000000 ! 'video/x-raw,width=640,height=480,framerate=25/1' ! jpegenc ! rtpjpegpay ! udpsink host=<ip of your GCS> port=5600
and on my Windows laptop as:
C:\gstreamer\1.0\x86_64\bin\gst-launch-1.0.exe -v udpsrc port=5600 ! application/x-rtp, encoding-name=JPEG, payload=26 ! rtpjitterbuffer ! rtpjpegdepay ! jpegdec ! autovideosink
Of course, you will have to install rpicamsrc first on your drone’s Raspberry Pi by running these commands:
git clone https://github.com/thaytan/gst-rpicamsrc
cd gst-rpicamsrc
./autogen.sh --prefix=/usr --libdir=/usr/local/bin
make
sudo make install
For me this has worked so far… Let me know how it goes…
See the video at https://www.srtalliance.org/ to understand the difference…
I have followed the install instructions, and when I get to make, it says “No targets specified and no makefile found”. Any ideas on what I’m doing wrong?
This topic was automatically closed 100 days after the last reply. New replies are no longer allowed.