So I’m using the current Navio2 image and the latest ArduCopter quad firmware.
I am doing a project where I have a Navio2 with an upward-facing PiCamera, and I want the drone to position itself underneath a hanging cable, oriented parallel to the line.
I am running an OpenCV script (HoughLines) that gives me the angle of the detected line and its distance from the left edge of the camera frame, so I have the input data I need for a yaw and roll controller.
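For context, this is roughly the geometry I use to turn a HoughLines result into controller inputs. The function name and the convention of measuring distance along the middle row of the frame are just my illustration, not library API:

```python
import numpy as np

def line_offsets(rho, theta, frame_h):
    """Convert one HoughLines result (rho, theta), i.e. the Hesse normal
    form x*cos(theta) + y*sin(theta) = rho, into:
      - the line's tilt from vertical, in degrees (for the yaw controller)
      - the horizontal distance in pixels from the left frame edge to the
        line, measured along the frame's middle row (for the roll controller)
    Assumes the line is not horizontal, so cos(theta) != 0."""
    y_mid = frame_h / 2.0
    # Solve the Hesse normal form for x at y = y_mid
    x_at_mid = (rho - y_mid * np.sin(theta)) / np.cos(theta)
    # theta is the angle of the line's normal from the x-axis, which equals
    # the line's own tilt away from vertical
    angle_deg = np.degrees(theta)
    return angle_deg, x_at_mid

# A perfectly vertical line 100 px from the left edge has rho=100, theta=0:
# angle 0 deg, distance 100 px.
```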
I was wondering if I could get this angle and distance data straight from my OpenCV script into ArduCopter — store it locally in shared memory or something similar — and read it in a custom flight mode, where I could code a controller that takes this data as input and yaws and rolls accordingly.

I know I could write something in Python and send yaw and roll commands through DroneKit, but I was wondering whether it is possible to bypass the MAVLink communication entirely and treat the OpenCV output as just another “sensor” input that the ArduCopter code can read and react to. I would think this would be better for a custom attitude controller, since there would be no delay from sending the data through MAVLink messages.
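To make the shared-memory idea concrete, here is a rough sketch of the producer side: the OpenCV script publishing the latest (angle, distance) pair through a small memory-mapped file. The path, struct format, and function names are placeholders I made up; the ArduCopter side would still need a custom driver or flight mode to map the same file, and a real implementation would want a sequence counter or lock so the reader can't see a torn write:

```python
import mmap
import os
import struct

PATH = "/tmp/line_pose.bin"  # hypothetical shared file, not an ArduPilot convention
FMT = "<dd"                  # angle (deg) and distance (px) as little-endian doubles
SIZE = struct.calcsize(FMT)

def open_shared(path=PATH):
    """Create/resize the backing file and map it read-write."""
    fd = os.open(path, os.O_RDWR | os.O_CREAT, 0o600)
    os.ftruncate(fd, SIZE)
    return mmap.mmap(fd, SIZE)

def write_pose(mm, angle, distance):
    """Overwrite the shared region with the newest measurement."""
    mm.seek(0)
    mm.write(struct.pack(FMT, angle, distance))

def read_pose(mm):
    """Read back the latest (angle, distance) pair."""
    mm.seek(0)
    return struct.unpack(FMT, mm.read(SIZE))
```

Since ArduCopter on the Navio2 runs as an ordinary Linux process, a C++ reader inside a custom flight mode could in principle `mmap()` the same file, which is what makes this approach plausible without MAVLink in the loop.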
All help is appreciated!