File "$VISP_WS/3rdparty/ARDroneSDK3/out/arsdk-native/staging-host/usr/lib/mavgen/pymavlink/generator/mavcrc.py", line 28, in accumulate_str
bytes.fromstring(buf)
AttributeError: 'array.array' object has no attribute 'fromstring'
You may edit $VISP_WS/3rdparty/ARDroneSDK3/out/arsdk-native/staging-host/usr/lib/mavgen/pymavlink/generator/mavcrc.py and modify accumulate_str(), replacing
def accumulate_str(self, buf):
    '''add in some more bytes'''
    accum = self.crc
    import array
    bytes = array.array('B')
    bytes.fromstring(buf)
    self.accumulate(bytes)
by
def accumulate_str(self, buf):
    '''add in some more bytes'''
    accum = self.crc
    import array
    bytes_array = array.array('B')
    try:  # if buf is bytes
        bytes_array.frombytes(buf)
    except TypeError:  # if buf is str
        bytes_array.frombytes(buf.encode())
    except AttributeError:  # Python < 3.2
        bytes_array.fromstring(buf)
    self.accumulate(bytes_array)
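To see why the patched version works for every input type, here is a small standalone check of the same conversion logic (the to_byte_array helper is hypothetical and only for illustration; it assumes Python 3):

```python
import array

def to_byte_array(buf):
    # Mirrors the patched accumulate_str() conversion: accept bytes or str.
    ba = array.array('B')
    try:
        ba.frombytes(buf)            # buf is bytes
    except TypeError:
        ba.frombytes(buf.encode())   # buf is str: encode it first
    return ba

print(list(to_byte_array(b'\x01\x02')))  # [1, 2]
print(list(to_byte_array('AB')))         # [65, 66]
```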
3. Set ARSDK_DIR environment variable
In order for ViSP to find ARDroneSDK3, set ARSDK_DIR environment variable:
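For example, assuming the SDK was built in the workspace location used in the previous steps (the VISP_WS value below is illustrative; adjust it to your own workspace), you could add something like this to your shell startup file:

```shell
# Illustrative paths; adjust VISP_WS to your own workspace.
export VISP_WS=$HOME/visp-ws
export ARSDK_DIR=$VISP_WS/3rdparty/ARDroneSDK3
echo $ARSDK_DIR
```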
The following image shows the frames attached to the drone:
There is the drone control frame, also called end-effector frame, in which we can control the drone in velocity by applying v = (v_x, v_y, v_z, w_z), corresponding respectively to the 3 translational velocities along the X_e, Y_e and Z_e axes, and the rotational velocity along the Z_e axis. The vpRobotBebop2 class allows sending these velocities. Note that the 6-dim velocity skew vector is named v_e.
There is also the camera frame, with axes X_c, Y_c and Z_c, in which we define the velocity skew vector v_c.
The homogeneous transformation between the camera frame and the end-effector frame is named cMe. This transformation is implemented as a vpHomogeneousMatrix.
In the servoBebop2.cpp example, we use four visual features for the servoing in order to control the four drone dof (v_x, v_y, v_z, w_z). These visual features are:
Centered and normalized gravity center moments of the tag, x_n and y_n, along the camera X and Y axes. These features are implemented in vpFeatureMomentGravityCenterNormalized and used to center the tag in the image.
Normalized area moment of the tag, a_n. This feature, implemented in vpFeatureMomentAreaNormalized, is used to control the distance between the drone and the tag.
Horizontal vanishing point position corresponding to the intersection of the two lines passing through the top and bottom tag edges. From the polar coordinates (rho, theta) of this point, we use the atan(1/rho) visual feature. This feature, implemented in vpFeatureVanishingPoint, is used to control the orientation of the drone about its vertical axis based on the tag orientation.
The corresponding controller computes v_e = -lambda (Ls cVe eJe)^+ (s - s*), where lambda is the controller gain and:
Ls is the interaction matrix corresponding to the visual features s. This matrix is updated in vpServo.
cVe is the velocity twist matrix built using cMe. Implemented in vpVelocityTwistMatrix, it allows transforming a velocity skew from the end-effector frame into the camera frame: v_c = cVe v_e.
eJe is the robot Jacobian that makes the link between the velocity skew v_e and the control dof (v_x, v_y, v_z, w_z) in the end-effector frame.
s and s* are respectively the current and desired visual feature vectors.
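As a purely numerical illustration of this control law (this is NOT ViSP code; the matrix values below are random placeholders, and only the dimensions mirror the four-feature, four-dof setup described above):

```python
import numpy as np

# Sketch of v_e = -lambda * (Ls cVe eJe)^+ (s - s*):
# Ls: 4x6 interaction matrix, cVe: 6x6 velocity twist, eJe: 6x4 robot Jacobian.
rng = np.random.default_rng(0)
Ls  = rng.standard_normal((4, 6))
cVe = rng.standard_normal((6, 6))
eJe = rng.standard_normal((6, 4))

s      = np.array([0.1, -0.05, 0.02, 0.3])  # current features (xn, yn, an, atan(1/rho))
s_star = np.zeros(4)                        # desired features
lam = 0.5                                   # controller gain

J = Ls @ cVe @ eJe                          # 4x4 task Jacobian
v_e = -lam * np.linalg.pinv(J) @ (s - s_star)
print(v_e.shape)  # (4,) -> (vx, vy, vz, wz) sent to the drone
```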
To make the relation between this controller description and the code, check the comments in servoBebop2.cpp.
Running the program
The next step is now to run the image-based visual servoing example implemented in servoBebop2.cpp.
Note
Before starting the program, the drone should be turned on and the computer connected to the drone WiFi, as shown in the following pictures:
On Ubuntu:
On Mac OSX:
Warning
CAUTION: It is strongly recommended to use this program outside or in a large room with non-uniform flooring, as the drone uses a downward-facing camera to estimate its motion from optical flow. If the surface under the drone is uniform, its movements will be inaccurate and dangerous.
If you built ViSP with ffmpeg and Parrot ARSDK3 support, the corresponding binary is available in ${VISP_WS}/visp-build/example/servo-bebop2 folder.
$ cd ${VISP_WS}/visp-build/example/servo-bebop2
$ ./servoBebop2 --tag-size 0.14
Note
Passing the tag size (in meters) as a parameter is required.
On Mac OSX, you may need to allow servoBebop2 to accept incoming network connections:
Running the previous command should give results similar to those presented in the video:
Run ./servoBebop2 --help to see the available command line options.
Adding option --ip allows you to specify the IP address of the drone on the network (default is 192.168.42.1). This is useful if you changed your drone IP (see Changing Bebop 2 IP address), for instance if you want to fly multiple drones at once.
Adding option --distance-to-tag 1.5 allows you to specify the desired distance (in meters) between the drone and the tag during servoing. Values between 0.5 and 2 are recommended (default is 1 meter).
Adding option --intrinsic ~/path-to-calibration-file/camera.xml allows you to specify the intrinsic camera calibration parameters. This file can be obtained by completing Tutorial: Camera intrinsic calibration. Without this option, default parameters that are accurate enough for a trial will be used.
Adding option --hd_stream enables HD 720p stream resolution instead of the default 480p. This increases the range and accuracy of the tag detection, but also increases latency and computation time.
Note
Camera calibration settings are different for the two resolutions.
Make sure that if you pass custom intrinsic camera parameters, they were obtained with the correct resolution.
Adding option --verbose or -v enables the display of information messages from the drone and of the velocity commands sent to it.
The program will first connect to the drone, start the video streaming and decoding, and then the drone will take off and hover until it detects one (and one only) 36h11 AprilTag in the image.
We then display the drone video stream with the visible features, as well as the error for each feature:
In this graph:
Xn corresponds to the error (x_n - x_n*) that allows controlling the tag center of gravity along the camera X axis,
Yn corresponds to the error (y_n - y_n*) that allows controlling the tag center of gravity along the camera Y axis,
an corresponds to the error (a_n - a_n*), used to regulate the distance between the drone and the tag along the camera Z axis,
atan(1/rho) corresponds to the error related to the vanishing point. This feature makes the drone adjust its orientation about its vertical axis to ensure that the two horizontal lines remain parallel.
Clicking on the drone view display will make the drone land, safely disconnect everything and quit the program.
Tips & Tricks
Changing Bebop 2 IP address
If you need to change the drone IP address, for flying multiple drones for instance, you can follow these steps:
Turn on your drone and connect to its WiFi network.
Press the drone on/off button 4 times.
Connect to the drone file system using telnet (if you haven't changed the drone IP yet, the default IP should be 192.168.42.1):
$ telnet 192.168.42.1
Note
If you get the message "Connection refused", you haven't properly pressed the on/off button 4 times.
If you get the message "Connection timed out", you haven't used the right IP. You can try with 192.168.43.1.
Once you're connected to the drone file system, you need to get write access to the files. You can do so with:
$ mount -o remount,rw /
Warning
You now have permission to move, edit or delete any file. Proceed at your own discretion, as you could irreversibly make your drone unusable!
Edit /sbin/broadcom_setup.sh:
$ cd sbin
$ vi broadcom_setup.sh
Note
If you don't know how to use the vi text editor:
move the cursor using the arrow keys,
press i to edit the text and Escape to cancel,
press : and enter wq to save and quit, or q! to quit without saving.
Edit the line IFACE_IP_AP="192.168.42.1" to IFACE_IP_AP="192.168.x.1", where x represents any number that you have not yet assigned to another drone.
Save and exit the text editor.
Exit Bebop 2 file system by entering exit.
Restart your drone. Its IP address should now be changed. You will have to adapt your programs accordingly.
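The editing step above can also be done non-interactively with sed instead of vi. This hypothetical equivalent is demonstrated on a local stand-in file; on the drone you would target /sbin/broadcom_setup.sh after the remount step:

```shell
# Create a local stand-in for /sbin/broadcom_setup.sh (illustration only).
printf 'IFACE_IP_AP="192.168.42.1"\n' > broadcom_setup.sh
# Rewrite the access-point IP in place (here: drone number 43).
sed -i 's/IFACE_IP_AP="192\.168\.42\.1"/IFACE_IP_AP="192.168.43.1"/' broadcom_setup.sh
cat broadcom_setup.sh   # IFACE_IP_AP="192.168.43.1"
```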
Connecting multiple Bebop 2 drones to a single computer
If you want to control multiple drones using one single computer, you're going to need to change the drones' IP addresses, by following Changing Bebop 2 IP address.
Once every drone you want to use has a unique IP address, you need to connect your PC to each drone WiFi network. You can use multiple WiFi dongles and your PC WiFi card, if it has one.
For two drones, it should look like this (on Ubuntu):
In ViSP programs that use the drone, you can then use option --ip to specify the IP of the drone to which you want to connect:
$ cd ${VISP_WS}/visp-build/example/servo-bebop2
$ ./keyboardControlBebop2 --ip 192.168.42.1
and in another terminal:
$ cd ${VISP_WS}/visp-build/example/servo-bebop2
$ ./keyboardControlBebop2 --ip 192.168.43.1
In your own programs, you can specify the IP in the constructor of the vpRobotBebop2 class:
vpRobotBebop2 drone(false, true, "192.168.43.1"); // This creates the drone with low verbose level, settings reset and corresponding IP
Next tutorial
If needed, you can follow the section of Tutorial: Image frame grabbing dedicated to the Parrot Bebop 2 to get images of the calibration grid.
You can also calibrate your drone camera and generate an XML file usable in the servoing program (see Tutorial: Camera intrinsic calibration).
If you need more details about this program, check the comments in servoBebop2.cpp.
You can check example program keyboardControlBebop2.cpp if you want to see how to control a Bebop 2 drone with the keyboard.
You can also check vpRobotBebop2 to see the full documentation of the Bebop 2 ViSP class.
Finally, if you are more interested in doing the same experiment with the ROS framework, you can follow the How to do visual servoing with Parrot Bebop 2 drone using visp_ros tutorial.