RobotViper850

class RobotViper850(self, verbose: bool = True)

Bases: Viper850, Robot

Control of Irisa’s Viper S850 robot named Viper850.

Implementation of the vpRobot class in order to control Irisa’s Viper850 robot. This robot is an ADEPT six degrees-of-freedom arm. A FireWire camera is mounted on the end-effector to allow eye-in-hand visual servoing. This camera is controlled by the vp1394TwoGrabber class.

The non-modified Denavit-Hartenberg representation of the robot is given in the table below, where \(q_1^*, \ldots, q_6^*\) are the variable joint positions.

\[\begin{split}\begin{tabular}{|c|c|c|c|c|}
\hline
Joint & $a_i$ & $d_i$ & $\alpha_i$ & $\theta_i$ \\
\hline
1 & $a_1$ & $d_1$ & $-\pi/2$ & $q_1^*$ \\
2 & $a_2$ & 0 & 0 & $q_2^*$ \\
3 & $a_3$ & 0 & $-\pi/2$ & $q_3^* - \pi$ \\
4 & 0 & $d_4$ & $\pi/2$ & $q_4^*$ \\
5 & 0 & 0 & $-\pi/2$ & $q_5^*$ \\
6 & 0 & 0 & 0 & $q_6^* - \pi$ \\
7 & 0 & $d_6$ & 0 & 0 \\
\hline
\end{tabular}\end{split}\]

In this model, the following frames have to be considered.

  • \({\cal F}_f\) : the reference frame, also called world frame

  • \({\cal F}_w\) : the wrist frame located at the intersection of the last three rotations, with \(^f{\bf M}_w = ^0{\bf M}_6\)

  • \({\cal F}_e\) : the end-effector frame located at the interface of the two tool changers, with \(^f{\bf M}_e = ^0{\bf M}_7\)

  • \({\cal F}_c\) : the camera or tool frame, with \(^f{\bf M}_c = ^f{\bf M}_e \; ^e{\bf M}_c\) where \(^e{\bf M}_c\) is the result of a calibration stage. We can also consider a custom tool vpViper850::TOOL_CUSTOM and set this during robot initialisation or using set_eMc() .

  • \({\cal F}_s\) : the force/torque sensor frame, with \(d_7 = 0.0666\).

This class allows controlling the Viper850 arm robot in position and velocity:

  • in the joint space ( vpRobot::ARTICULAR_FRAME ),

  • in the fixed reference frame \({\cal F}_f\) ( vpRobot::REFERENCE_FRAME ),

  • in the camera or tool frame \({\cal F}_c\) ( vpRobot::CAMERA_FRAME ),

  • or in a mixed frame ( vpRobot::MIXT_FRAME ) where translations are expressed in the reference frame \({\cal F}_f\) and rotations in the camera or tool frame \({\cal F}_c\) .

End-effector frame ( vpRobot::END_EFFECTOR_FRAME ) is not implemented.

All translations are expressed in meters for positions and in m/s for velocities. Rotations are expressed in radians for positions and in rad/s for rotation velocities.

The direct and inverse kinematics models are implemented in the vpViper850 class.

Warning

Ctrl-C, segmentation faults and other system errors are caught by this class in order to stop the robot.

To communicate with the robot, you may first create an instance of this class by calling the default constructor:

#include <visp3/robot/vpRobotViper850.h>

int main()
{
#ifdef VISP_HAVE_VIPER850
  vpRobotViper850 robot;
#endif
}

This initializes the robot kinematics with the \(^e{\bf M}_c\) extrinsic camera parameters obtained with a projection model without distortion. To set the robot kinematics with the \(^e{\bf M}_c\) transformation obtained with a camera perspective model including distortion, you need to initialize the robot with:

#include <visp3/robot/vpRobotViper850.h>

#ifdef ENABLE_VISP_NAMESPACE
using namespace VISP_NAMESPACE_NAME;
#endif

int main()
{
#ifdef VISP_HAVE_VIPER850
  vpRobotViper850 robot;

  // Set the extrinsic camera parameters obtained with a perspective
  // projection model including a distortion parameter
  robot.init(vpViper850::TOOL_MARLIN_F033C_CAMERA, vpCameraParameters::perspectiveProjWithDistortion);
#endif
}

You can get the intrinsic camera parameters of an image acquired by the camera attached to the robot, with:

#include <visp3/core/vpCameraParameters.h>
#include <visp3/core/vpImage.h>
#include <visp3/robot/vpRobotViper850.h>
#include <visp3/sensor/vp1394TwoGrabber.h>

#ifdef ENABLE_VISP_NAMESPACE
using namespace VISP_NAMESPACE_NAME;
#endif

int main()
{
#if defined(VISP_HAVE_VIPER850) && defined(VISP_HAVE_DC1394)
  vpImage<unsigned char> I;
  vp1394TwoGrabber g;
  g.acquire(I);

  vpRobotViper850 robot;

  // ...

  vpCameraParameters cam;
  robot.getCameraParameters(cam, I);
  // In cam, you get the intrinsic parameters of the projection model
  // with distortion.
#endif
}

To control the robot in position, you may set the controller to position control and then send the position to reach in a specific frame, like here in the joint space:

#include <visp3/core/vpColVector.h>
#include <visp3/core/vpMath.h>
#include <visp3/robot/vpRobotViper850.h>

#ifdef ENABLE_VISP_NAMESPACE
using namespace VISP_NAMESPACE_NAME;
#endif

int main()
{
#ifdef VISP_HAVE_VIPER850
  vpRobotViper850 robot;

  vpColVector q(6);
  // Set a joint position
  q[0] = vpMath::rad(10); // Joint 1 position, in rad
  q[1] = 0.2;             // Joint 2 position, in rad
  q[2] = 0.3;             // Joint 3 position, in rad
  q[3] = M_PI/8;          // Joint 4 position, in rad
  q[4] = M_PI/4;          // Joint 5 position, in rad
  q[5] = M_PI;            // Joint 6 position, in rad

  // Initialize the controller to position control
  robot.setRobotState(vpRobot::STATE_POSITION_CONTROL);

  // Moves the robot in the joint space
  robot.setPosition(vpRobot::ARTICULAR_FRAME, q);
#endif
}

The robot moves to the specified position with the default positioning velocity vpRobotViper850::m_defaultPositioningVelocity . The setPositioningVelocity() method allows changing the maximal velocity used to reach the desired position.

#include <visp3/core/vpColVector.h>
#include <visp3/core/vpMath.h>
#include <visp3/robot/vpRobotViper850.h>

#ifdef ENABLE_VISP_NAMESPACE
using namespace VISP_NAMESPACE_NAME;
#endif

int main()
{
#ifdef VISP_HAVE_VIPER850
  vpRobotViper850 robot;

  vpColVector q(6);
  // Set q[i] with i in [0:5]

  // Initialize the controller to position control
  robot.setRobotState(vpRobot::STATE_POSITION_CONTROL);

  // Set the max velocity to 40%
  robot.setPositioningVelocity(40);

  // Moves the robot in the joint space
  robot.setPosition(vpRobot::ARTICULAR_FRAME, q);
#endif
}

To control the robot in velocity, you may set the controller to velocity control and then send the velocities. To end the velocity control and stop the robot, you have to set the controller to the stop state. Here is an example of a velocity control in the joint space:

#include <visp3/core/vpColVector.h>
#include <visp3/core/vpMath.h>
#include <visp3/robot/vpRobotViper850.h>

#ifdef ENABLE_VISP_NAMESPACE
using namespace VISP_NAMESPACE_NAME;
#endif

int main()
{
#ifdef VISP_HAVE_VIPER850
  vpRobotViper850 robot;

  vpColVector qvel(6);
  // Set a joint velocity
  qvel[0] = 0.1;             // Joint 1 velocity in rad/s
  qvel[1] = vpMath::rad(15); // Joint 2 velocity in rad/s
  qvel[2] = 0;               // Joint 3 velocity in rad/s
  qvel[3] = M_PI/8;          // Joint 4 velocity in rad/s
  qvel[4] = 0;               // Joint 5 velocity in rad/s
  qvel[5] = 0;               // Joint 6 velocity in rad/s

  // Initialize the controller to velocity control
  robot.setRobotState(vpRobot::STATE_VELOCITY_CONTROL);

  for ( ; ; ) {
    // Apply a velocity in the joint space
    robot.setVelocity(vpRobot::ARTICULAR_FRAME, qvel);

    // Compute new velocities qvel...
  }

  // Stop the robot
  robot.setRobotState(vpRobot::STATE_STOP);
#endif
}

It is also possible to specify the position of a custom tool Cartesian frame. To this end, this frame has to be specified with respect to the end-effector frame through the \(^e {\bf M}_c\) transformation. This can be done by initializing the robot with init(vpViper850::vpToolType, const vpHomogeneousMatrix &) or init(vpViper850::vpToolType, const std::string &), or by using set_eMc() . The following example illustrates this use case:

#include <visp3/core/vpHomogeneousMatrix.h>
#include <visp3/robot/vpRobotViper850.h>

#ifdef ENABLE_VISP_NAMESPACE
using namespace VISP_NAMESPACE_NAME;
#endif

int main()
{
#ifdef VISP_HAVE_VIPER850
  vpRobotViper850 robot;

  // Set the transformation between the end-effector frame
  // and the tool frame.
  vpHomogeneousMatrix eMc(0.001, 0.0, 0.1, 0.0, 0.0, M_PI/2);

  robot.init(vpViper850::TOOL_CUSTOM, eMc);
#endif
}

It is also possible to measure the robot current position with the getPosition() method and the robot current velocities with the getVelocity() method.

For convenience, there is also the ability to read/write joint positions from a position file with the readPosFile() and savePosFile() methods.
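
The snippet below is a minimal Python sketch that ties these calls together: it reads the current joint position and velocity and saves the position to a file. The visp.core / visp.robot module layout is an assumption of this example; it also assumes a reachable Viper850.

# Minimal sketch (assumptions: visp Python bindings installed, robot reachable)
from visp.core import ColVector
from visp.robot import RobotViper850

robot = RobotViper850()

q = ColVector(6)      # joint positions, in rad
qdot = ColVector(6)   # joint velocities, in rad/s

robot.getPosition(RobotViper850.ARTICULAR_FRAME, q)
robot.getVelocity(RobotViper850.ARTICULAR_FRAME, qdot)

# Save the measured joint positions in a Viper850 position file
RobotViper850.savePosFile("current.pos", q)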

Methods

__init__

biasForceTorqueSensor

Bias the force/torque sensor.

closeGripper

Close the pneumatic two fingers gripper.

disableJoint6Limits

Warning

Each call to this function should be done carefully.

enableJoint6Limits

Enable the joint limits on axis number 6.

getControlMode

return:

The control mode indicating if the robot is in automatic, manual (usage of the dead man switch) or emergency stop mode.

getDisplacement

Get the robot displacement since the last call of this method.

getForceTorque

Overloaded function.

getMaxRotationVelocityJoint6

Get the maximal rotation velocity on joint 6 that is used only during velocity joint control.

getPosition

Overloaded function.

getPositioningVelocity

Get the maximal velocity percentage used for a position control.

getPowerState

Get the robot power state indication if power is on or off.

getTime

Returns the robot controller current time (in seconds) since the last robot power on.

getVelocity

Overloaded function.

get_cMe

Get the geometric transformation \(^c{\bf M}_e\) between the camera frame and the end-effector frame.

get_cVe

Get the twist transformation \(^c{\bf V}_e\) from camera frame to end-effector frame.

get_eJe

Get the robot jacobian expressed in the end-effector frame.

get_fJe

Get the robot jacobian expressed in the robot reference frame also called fix frame.

init

Overloaded function.

move

Moves the robot to the joint position specified in the filename.

openGripper

Open the pneumatic two fingers gripper.

powerOff

Power off the robot.

powerOn

Power on the robot.

readPosFile

Read joint positions in a specific Viper850 position file.

savePosFile

Save joint (articular) positions in a specific Viper850 position file.

setMaxRotationVelocity

Overloaded function.

setMaxRotationVelocityJoint6

Set the maximal rotation velocity on joint 6 that is used only during velocity joint control.

setPosition

Overloaded function.

setPositioningVelocity

Set the maximal velocity percentage to use for a position control.

setRobotState

Overloaded function.

setVelocity

set_eMc

Overloaded function.

stopMotion

Stop the robot and set the robot state to vpRobot::STATE_STOP .

unbiasForceTorqueSensor

Unbias the force/torque sensor.

Inherited Methods

STATE_FORCE_TORQUE_CONTROL

TOOL_FRAME

CONST_GENERIC_CAMERA_NAME

TOOL_SCHUNK_GRIPPER_CAMERA

getMaxTranslationVelocity

Get the maximal translation velocity that can be sent to the robot during a velocity control.

getJointMin

Get minimal joint values.

get_fMe

Compute the forward kinematics (direct geometric model) as an homogeneous matrix \({^f}{\bf M}_e\) .

getNDof

Return robot degrees of freedom number.

getCoupl56

Return the coupling factor between joint 5 and joint 6.

CONST_MARLIN_F033C_CAMERA_NAME

CONST_EMC_SCHUNK_GRIPPER_WITH_DISTORTION_FILENAME

CONST_EMC_MARLIN_F033C_WITH_DISTORTION_FILENAME

getCameraParametersProjType

Get the current camera model projection type.

CONST_EMC_GENERIC_WITHOUT_DISTORTION_FILENAME

TOOL_MARLIN_F033C_CAMERA

get_fMw

Compute the transformation between the fix frame and the wrist frame.

CONST_EMC_SCHUNK_GRIPPER_WITHOUT_DISTORTION_FILENAME

parseConfigFile

This function gets the robot constant parameters from a file.

getForwardKinematics

Compute the forward kinematics (direct geometric model) as an homogeneous matrix.

njoint

saturateVelocities

Saturate velocities.

ControlFrameType

Robot control frames.

get_fMc

Overloaded function.

TOOL_CUSTOM

getCameraParameters

Overloaded function.

setVerbose

CONST_EMC_GENERIC_WITH_DISTORTION_FILENAME

CONST_SCHUNK_GRIPPER_CAMERA_NAME

ToolType

List of possible tools that can be attached to the robot end-effector.

getJointMax

Get maximal joint values.

getToolType

Get the current tool type.

ARTICULAR_FRAME

CONST_EMC_MARLIN_F033C_WITHOUT_DISTORTION_FILENAME

get_wMe

Return the transformation between the wrist frame and the end-effector.

get_fJw

Get the robot jacobian \({^f}{\bf J}_w\) which expresses the velocity of the origin of the wrist frame in the robot reference frame also called fix frame.

CONST_EMC_PTGREY_FLEA2_WITH_DISTORTION_FILENAME

JOINT_STATE

REFERENCE_FRAME

getInverseKinematicsWrist

Compute the inverse kinematics (inverse geometric model).

RobotStateType

Robot states.

getMaxRotationVelocity

Get the maximal rotation velocity that can be sent to the robot during a velocity control.

TOOL_PTGREY_FLEA2_CAMERA

get_eMc

Get the geometric transformation between the end-effector frame and the camera frame.

STATE_STOP

STATE_POSITION_CONTROL

CONST_PTGREY_FLEA2_CAMERA_NAME

STATE_ACCELERATION_CONTROL

defaultTool

MIXT_FRAME

STATE_VELOCITY_CONTROL

setMaxTranslationVelocity

Set the maximal translation velocity that can be sent to the robot during a velocity control.

TOOL_GENERIC_CAMERA

CAMERA_FRAME

CONST_CAMERA_FILENAME

CONST_EMC_PTGREY_FLEA2_WITHOUT_DISTORTION_FILENAME

get_eMs

Get the geometric transformation between the end-effector frame and the force/torque sensor frame.

END_EFFECTOR_FRAME

getInverseKinematics

Compute the inverse kinematics (inverse geometric model).

getRobotState

Operators

__doc__

__init__

__module__

Attributes

ARTICULAR_FRAME

AUTO

CAMERA_FRAME

CONST_CAMERA_FILENAME

CONST_EMC_GENERIC_WITHOUT_DISTORTION_FILENAME

CONST_EMC_GENERIC_WITH_DISTORTION_FILENAME

CONST_EMC_MARLIN_F033C_WITHOUT_DISTORTION_FILENAME

CONST_EMC_MARLIN_F033C_WITH_DISTORTION_FILENAME

CONST_EMC_PTGREY_FLEA2_WITHOUT_DISTORTION_FILENAME

CONST_EMC_PTGREY_FLEA2_WITH_DISTORTION_FILENAME

CONST_EMC_SCHUNK_GRIPPER_WITHOUT_DISTORTION_FILENAME

CONST_EMC_SCHUNK_GRIPPER_WITH_DISTORTION_FILENAME

CONST_GENERIC_CAMERA_NAME

CONST_MARLIN_F033C_CAMERA_NAME

CONST_PTGREY_FLEA2_CAMERA_NAME

CONST_SCHUNK_GRIPPER_CAMERA_NAME

END_EFFECTOR_FRAME

ESTOP

JOINT_STATE

MANUAL

MIXT_FRAME

REFERENCE_FRAME

STATE_ACCELERATION_CONTROL

STATE_FORCE_TORQUE_CONTROL

STATE_POSITION_CONTROL

STATE_STOP

STATE_VELOCITY_CONTROL

TOOL_CUSTOM

TOOL_FRAME

TOOL_GENERIC_CAMERA

TOOL_MARLIN_F033C_CAMERA

TOOL_PTGREY_FLEA2_CAMERA

TOOL_SCHUNK_GRIPPER_CAMERA

__annotations__

defaultPositioningVelocity

defaultTool

njoint

class ControlFrameType(self, value: int)

Bases: pybind11_object

Robot control frames.

Values:

  • REFERENCE_FRAME: Corresponds to a fixed reference frame attached to the robot structure.

  • ARTICULAR_FRAME: Corresponds to the joint state. This value is deprecated. You should rather use vpRobot::JOINT_STATE .

  • JOINT_STATE: Corresponds to the joint state.

  • END_EFFECTOR_FRAME: Corresponds to robot end-effector frame.

  • CAMERA_FRAME: Corresponds to a frame attached to the camera mounted on the robot end-effector.

  • TOOL_FRAME: Corresponds to a frame attached to the tool (camera, gripper…) mounted on the robot end-effector. This value is equal to vpRobot::CAMERA_FRAME .

  • MIXT_FRAME: Corresponds to a “virtual” frame where translations are expressed in the reference frame, and rotations in the camera frame.

__and__(self, other: object) object
__eq__(self, other: object) bool
__ge__(self, other: object) bool
__getstate__(self) int
__gt__(self, other: object) bool
__hash__(self) int
__index__(self) int
__init__(self, value: int)
__int__(self) int
__invert__(self) object
__le__(self, other: object) bool
__lt__(self, other: object) bool
__ne__(self, other: object) bool
__or__(self, other: object) object
__rand__(self, other: object) object
__ror__(self, other: object) object
__rxor__(self, other: object) object
__setstate__(self, state: int) None
__xor__(self, other: object) object
property name : str
class ControlModeType(self, value: int)

Bases: pybind11_object

Control mode.

Values:

  • AUTO: Automatic control mode (default).

  • MANUAL: Manual control mode activated when the dead man switch is in use.

  • ESTOP: Emergency stop activated.

__and__(self, other: object) object
__eq__(self, other: object) bool
__ge__(self, other: object) bool
__getstate__(self) int
__gt__(self, other: object) bool
__hash__(self) int
__index__(self) int
__init__(self, value: int)
__int__(self) int
__invert__(self) object
__le__(self, other: object) bool
__lt__(self, other: object) bool
__ne__(self, other: object) bool
__or__(self, other: object) object
__rand__(self, other: object) object
__ror__(self, other: object) object
__rxor__(self, other: object) object
__setstate__(self, state: int) None
__xor__(self, other: object) object
property name : str
class RobotStateType(self, value: int)

Bases: pybind11_object

Robot states.

Values:

  • STATE_STOP: Stops robot motion, especially in velocity and acceleration control.

  • STATE_VELOCITY_CONTROL: Initialize the velocity controller.

  • STATE_POSITION_CONTROL: Initialize the position controller.

  • STATE_ACCELERATION_CONTROL: Initialize the acceleration controller.

  • STATE_FORCE_TORQUE_CONTROL: Initialize the force/torque controller.

__and__(self, other: object) object
__eq__(self, other: object) bool
__ge__(self, other: object) bool
__getstate__(self) int
__gt__(self, other: object) bool
__hash__(self) int
__index__(self) int
__init__(self, value: int)
__int__(self) int
__invert__(self) object
__le__(self, other: object) bool
__lt__(self, other: object) bool
__ne__(self, other: object) bool
__or__(self, other: object) object
__rand__(self, other: object) object
__ror__(self, other: object) object
__rxor__(self, other: object) object
__setstate__(self, state: int) None
__xor__(self, other: object) object
property name : str
class ToolType(self, value: int)

Bases: pybind11_object

List of possible tools that can be attached to the robot end-effector.

Values:

  • TOOL_MARLIN_F033C_CAMERA: Marlin F033C camera.

  • TOOL_PTGREY_FLEA2_CAMERA: Point Grey Flea 2 camera.

  • TOOL_SCHUNK_GRIPPER_CAMERA: Camera attached to the Schunk gripper.

  • TOOL_GENERIC_CAMERA: A generic camera.

  • TOOL_CUSTOM: A user defined tool.

__and__(self, other: object) object
__eq__(self, other: object) bool
__ge__(self, other: object) bool
__getstate__(self) int
__gt__(self, other: object) bool
__hash__(self) int
__index__(self) int
__init__(self, value: int)
__int__(self) int
__invert__(self) object
__le__(self, other: object) bool
__lt__(self, other: object) bool
__ne__(self, other: object) bool
__or__(self, other: object) object
__rand__(self, other: object) object
__ror__(self, other: object) object
__rxor__(self, other: object) object
__setstate__(self, state: int) None
__xor__(self, other: object) object
property name : str
__init__(self, verbose: bool = True)
biasForceTorqueSensor(self) None

Bias the force/torque sensor.

Note

See unbiasForceTorqueSensor() , getForceTorque()

closeGripper(self) None

Close the pneumatic two fingers gripper.

Note

See openGripper()

disableJoint6Limits(self) None

Warning

Each call to this function should be done carefully.

Disable the joint limits on axis number 6. When joint 6 is outside its limits, a call to this function allows bringing the robot back to a position inside the limits. Don’t forget to then call enableJoint6Limits() to reduce the working space for joint 6.

Note

See enableJoint6Limits()
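
The following Python sketch illustrates the intended sequence; the visp.robot module path is an assumption of this example, and the actual motion used to bring joint 6 back inside its limits is left out.

# Sketch: joint 6 is assumed to be outside its limits
from visp.robot import RobotViper850

robot = RobotViper850()

robot.disableJoint6Limits()   # temporarily ignore the joint 6 limits
robot.setRobotState(RobotViper850.STATE_POSITION_CONTROL)
# ... move joint 6 back inside its limits, e.g. with setPosition() ...
robot.enableJoint6Limits()    # restore the joint 6 limits (default behavior)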

enableJoint6Limits(self) None

Enable the joint limits on axis number 6. This is the default.

Note

See disableJoint6Limits()

getCameraParameters(*args, **kwargs)

Overloaded function.

  1. getCameraParameters(self: visp._visp.robot.Viper850, cam: visp._visp.core.CameraParameters, image_width: int, image_height: int) -> None

Get the current intrinsic camera parameters obtained by calibration.

Warning

This method needs the XML library to parse the file defined in vpViper850::CONST_CAMERA_FILENAME that contains the camera parameters.

Warning

This method also needs access to the files containing the camera parameters in XML format. This access is available if the VISP_HAVE_VIPER850_DATA macro is defined in the include/visp3/core/vpConfig.h file.

  • If VISP_HAVE_VIPER850_DATA macro is defined, this method gets the camera parameters from const_camera_Viper850.xml config file.

  • If this macro is not defined, this method sets the camera parameters to default ones.

The code below shows how to get the camera parameters of the camera attached to the robot.

#include <visp3/core/vpImage.h>
#include <visp3/robot/vpRobotViper850.h>
#include <visp3/robot/vpViper850.h>
#include <visp3/sensor/vp1394TwoGrabber.h>

#ifdef ENABLE_VISP_NAMESPACE
using namespace VISP_NAMESPACE_NAME;
#endif

int main()
{
  vpImage<unsigned char> I(480, 640);

#ifdef VISP_HAVE_DC1394
  vp1394TwoGrabber g;

  // Acquire an image to update image structure
  g.acquire(I) ;
#endif

#ifdef VISP_HAVE_VIPER850
  vpRobotViper850 robot;
#else
  vpViper850 robot;
#endif

  vpCameraParameters cam ;
  // Get the intrinsic camera parameters depending on the image size
  // Camera parameters are read from
  // /udd/fspindle/robot/Viper850/current/include/const_camera_Viper850.xml
  // if VISP_HAVE_VIPER850_DATA macro is defined
  // in vpConfig.h file
  try {
    robot.getCameraParameters (cam, I.getWidth(), I.getHeight());
  }
  catch(...) {
    std::cout << "Cannot get camera parameters for image: " << I.getWidth() << " x " << I.getHeight() << std::endl;
  }
  std::cout << "Camera parameters: " << cam << std::endl;
}
Parameters:
cam

In output, camera parameters to fill.

image_width

Image width used to compute camera calibration.

image_height

Image height used to compute camera calibration.

  2. getCameraParameters(self: visp._visp.robot.Viper850, cam: visp._visp.core.CameraParameters, I: visp._visp.core.ImageGray) -> None

Get the current intrinsic camera parameters obtained by calibration.

Warning

This method needs the XML library to parse the file defined in vpViper850::CONST_CAMERA_FILENAME that contains the camera parameters.

Warning

This method also needs access to the files containing the camera parameters in XML format. This access is available if the VISP_HAVE_VIPER850_DATA macro is defined in the include/visp3/core/vpConfig.h file.

  • If VISP_HAVE_VIPER850_DATA macro is defined, this method gets the camera parameters from const_camera_Viper850.xml config file.

  • If this macro is not defined, this method sets the camera parameters to default ones.

#include <visp3/core/vpImage.h>
#include <visp3/robot/vpRobotViper850.h>
#include <visp3/robot/vpViper850.h>
#include <visp3/sensor/vp1394TwoGrabber.h>

int main()
{
  vpImage<unsigned char> I(480, 640);

#ifdef VISP_HAVE_DC1394
  vp1394TwoGrabber g;

  // Acquire an image to update image structure
  g.acquire(I) ;
#endif

#ifdef VISP_HAVE_VIPER850
  vpRobotViper850 robot;
#else
  vpViper850 robot;
#endif

  vpCameraParameters cam ;
  // Get the intrinsic camera parameters depending on the image size
  try {
    robot.getCameraParameters (cam, I);
  }
  catch(...) {
    std::cout << "Cannot get camera parameters for image: " << I.getWidth() << " x " << I.getHeight() << std::endl;
  }
  std::cout << "Camera parameters: " << cam << std::endl;
}
Parameters:
cam

In output, camera parameters to fill.

I

A B&W image sent by the current camera in use.

  3. getCameraParameters(self: visp._visp.robot.Viper850, cam: visp._visp.core.CameraParameters, I: visp._visp.core.ImageRGBa) -> None

Get the current intrinsic camera parameters obtained by calibration.

Warning

This method needs the XML library to parse the file defined in vpViper850::CONST_CAMERA_FILENAME that contains the camera parameters.

Warning

This method also needs access to the files containing the camera parameters in XML format. This access is available if the VISP_HAVE_VIPER850_DATA macro is defined in the include/visp3/core/vpConfig.h file.

  • If VISP_HAVE_VIPER850_DATA macro is defined, this method gets the camera parameters from const_camera_Viper850.xml config file.

  • If this macro is not defined, this method sets the camera parameters to default ones.

#include <visp3/core/vpImage.h>
#include <visp3/robot/vpRobotViper850.h>
#include <visp3/robot/vpViper850.h>
#include <visp3/sensor/vp1394TwoGrabber.h>

#ifdef ENABLE_VISP_NAMESPACE
using namespace VISP_NAMESPACE_NAME;
#endif

int main()
{
  vpImage<vpRGBa> I(480, 640);

#ifdef VISP_HAVE_DC1394
  vp1394TwoGrabber g;

  // Acquire an image to update image structure
  g.acquire(I) ;
#endif

#ifdef VISP_HAVE_VIPER850
  vpRobotViper850 robot;
#else
  vpViper850 robot;
#endif

  vpCameraParameters cam ;
  // Get the intrinsic camera parameters depending on the image size
  try {
    robot.getCameraParameters (cam, I);
  }
  catch(...) {
    std::cout << "Cannot get camera parameters for image: " << I.getWidth() << " x " << I.getHeight() << std::endl;
  }
  std::cout << "Camera parameters: " << cam << std::endl;
}
Parameters:
cam

In output, camera parameters to fill.

I

A color image sent by the current camera in use.

getCameraParametersProjType(self) visp._visp.core.CameraParameters.CameraParametersProjType

Get the current camera model projection type.

getControlMode(self) visp._visp.robot.RobotViper850.ControlModeType
Returns:

The control mode indicating if the robot is in automatic, manual (usage of the dead man switch) or emergency stop mode.

getCoupl56(self) float

Return the coupling factor between joint 5 and joint 6.

This factor is only useful when motor positions are considered. Since the positions returned by the robot are joint positions that already take the coupling factor into account, it does not have to be considered in the model of the robot.

getDisplacement(self, frame: visp._visp.robot.Robot.ControlFrameType, displacement: visp._visp.core.ColVector) None

Get the robot displacement since the last call of this method.

Warning

This functionality is not implemented for the moment in the Cartesian space. It is only available in the joint space ( vpRobot::ARTICULAR_FRAME ).

In camera or reference frame, rotations are expressed with the Euler Rxyz representation.

Parameters:
frame: visp._visp.robot.Robot.ControlFrameType

The frame in which the measured displacement is expressed.

displacement: visp._visp.core.ColVector

The displacement measured since the last call of this method, expressed in the given frame.
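
A minimal Python sketch of the intended usage, under the assumption that the visp.core / visp.robot module layout of the bindings is available:

from visp.core import ColVector
from visp.robot import RobotViper850

robot = RobotViper850()
dq = ColVector(6)

robot.getDisplacement(RobotViper850.ARTICULAR_FRAME, dq)  # first call: reference measurement
# ... the robot moves ...
robot.getDisplacement(RobotViper850.ARTICULAR_FRAME, dq)  # joint displacement since the previous call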

getForceTorque(*args, **kwargs)

Overloaded function.

  1. getForceTorque(self: visp._visp.robot.RobotViper850, H: visp._visp.core.ColVector) -> None

Get the rough force/torque sensor measures.

The code below shows how to get the force/torque measures after a sensor bias.

#include <visp3/core/vpColVector.h>
#include <visp3/core/vpTime.h>
#include <visp3/robot/vpRobotViper850.h>

#ifdef ENABLE_VISP_NAMESPACE
using namespace VISP_NAMESPACE_NAME;
#endif

int main()
{
#ifdef VISP_HAVE_VIPER850
  vpColVector  H; // force/torque measures [Fx, Fy, Fz, Tx, Ty, Tz]

  vpRobotViper850 robot;

  // Bias the force/torque sensor
  robot.biasForceTorqueSensor();

  for (unsigned int i=0; i< 10; i++) {
    robot.getForceTorque(H) ;
    std::cout << "Measured force/torque: " << H.t() << std::endl;
    vpTime::wait(5);
  }
#endif
}

Note

See biasForceTorqueSensor() , unbiasForceTorqueSensor()

Parameters:
H

[Fx, Fy, Fz, Tx, Ty, Tz] Forces/torques measured by the sensor.

  2. getForceTorque(self: visp._visp.robot.RobotViper850) -> visp._visp.core.ColVector

Get the rough force/torque sensor measures.

The code below shows how to get the force/torque measures after a sensor bias.

#include <visp3/core/vpColVector.h>
#include <visp3/core/vpTime.h>
#include <visp3/robot/vpRobotViper850.h>

#ifdef ENABLE_VISP_NAMESPACE
using namespace VISP_NAMESPACE_NAME;
#endif

int main()
{
#ifdef VISP_HAVE_VIPER850
  vpRobotViper850 robot;

  // Bias the force/torque sensor
  robot.biasForceTorqueSensor();

  for (unsigned int i=0; i< 10; i++) {
    vpColVector H = robot.getForceTorque(); // force/torque measures [Fx, Fy, Fz, Tx, Ty, Tz]
    std::cout << "Measured force/torque: " << H.t() << std::endl;
    vpTime::wait(5);
  }
#endif
}

Note

See biasForceTorqueSensor() , unbiasForceTorqueSensor()

Returns:

[Fx, Fy, Fz, Tx, Ty, Tz] Forces/torques measured by the sensor.

getForwardKinematics(self, q: visp._visp.core.ColVector) visp._visp.core.HomogeneousMatrix

Compute the forward kinematics (direct geometric model) as an homogeneous matrix.

By forward kinematics we mean here the position and the orientation of the camera relative to the base frame given the six joint positions.

This method is the same as get_fMc(const vpColVector & q).

Note

See get_fMc(const vpColVector & q)

Note

See getInverseKinematics()

Parameters:
q: visp._visp.core.ColVector

A six dimension vector corresponding to the robot joint positions expressed in radians.

Returns:

The homogeneous matrix \(^f{\bf M}_c\) corresponding to the direct geometric model which expresses the transformation between the base frame and the camera frame.
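
A minimal Python sketch of this method, assuming the visp.core / visp.robot module layout of the bindings and a reachable robot:

from visp.core import ColVector
from visp.robot import RobotViper850

robot = RobotViper850()

q = ColVector(6)
robot.getPosition(RobotViper850.ARTICULAR_FRAME, q)   # measured joint positions, in rad

fMc = robot.getForwardKinematics(q)   # fMc: camera frame w.r.t. the base frame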

getInverseKinematics(self, fMc: visp._visp.core.HomogeneousMatrix, q: visp._visp.core.ColVector, verbose: bool = false) int

Compute the inverse kinematics (inverse geometric model).

By inverse kinematics we mean here the six joint values given the position and the orientation of the camera frame relative to the base frame.

The code below shows how to compute the inverse geometric model:

vpColVector q1(6), q2(6);
vpHomogeneousMatrix fMc;

vpViper robot;

// Get the current joint position of the robot
robot.getPosition(vpRobot::ARTICULAR_FRAME, q1);

// Compute the pose of the camera in the reference frame using the
// direct geometric model
fMc = robot.getForwardKinematics(q1);
// this is similar to  fMc = robot.get_fMc(q1);
// or robot.get_fMc(q1, fMc);

// Compute the inverse geometric model
int nbsol; // number of solutions (0, 1 to 8) of the inverse geometric model
// get the nearest solution to the current joint position
nbsol = robot.getInverseKinematics(fMc, q1);

if (nbsol == 0)
  std::cout << "No solution of the inverse geometric model " << std::endl;
else if (nbsol >= 1)
  std::cout << "Nearest solution: " << q1 << std::endl;

Note

See getForwardKinematics() , getInverseKinematicsWrist

Parameters:
fMc: visp._visp.core.HomogeneousMatrix

Homogeneous matrix \(^f{\bf M}_c\) describing the transformation from base frame to the camera frame.

q: visp._visp.core.ColVector

In input, a six dimension vector corresponding to the current joint positions expressed in radians. In output, the solution of the inverse kinematics, i.e. the joint positions corresponding to \(^f{\bf M}_c\) .

verbose: bool = false

Add extra printings.

Returns:

The number of solutions (1 to 8) of the inverse geometric model; 0 if no solution can be found.

getInverseKinematicsWrist(self, fMw: visp._visp.core.HomogeneousMatrix, q: visp._visp.core.ColVector, verbose: bool = false) int

Compute the inverse kinematics (inverse geometric model).

By inverse kinematics we mean here the six joint values given the position and the orientation of the camera frame relative to the base frame.

The code below shows how to compute the inverse geometric model:

vpColVector q1(6), q2(6);
vpHomogeneousMatrix fMw;

vpViper robot;

// Get the current joint position of the robot
robot.getPosition(vpRobot::ARTICULAR_FRAME, q1);

// Compute the pose of the wrist in the reference frame using the
// direct geometric model
robot.get_fMw(q1, fMw);

// Compute the inverse geometric model
int nbsol; // number of solutions (0, 1 to 8) of the inverse geometric model
// get the nearest solution to the current joint position
nbsol = robot.getInverseKinematicsWrist(fMw, q1);

if (nbsol == 0)
  std::cout << "No solution of the inverse geometric model " << std::endl;
else if (nbsol >= 1)
  std::cout << "Nearest solution: " << q1 << std::endl;

Note

See getForwardKinematics() , getInverseKinematics()

Parameters:
fMw: visp._visp.core.HomogeneousMatrix

Homogeneous matrix \(^f{\bf M}_w\) describing the transformation from base frame to the wrist frame.

q: visp._visp.core.ColVector

In input, a six dimension vector corresponding to the current joint positions expressed in radians. In output, the solution of the inverse kinematics, i.e. the joint positions corresponding to \(^f{\bf M}_w\) .

verbose: bool = false

Add extra printings.

Returns:

The number of solutions (1 to 8) of the inverse geometric model; 0 if no solution can be found.

getJointMax(self) visp._visp.core.ColVector

Get maximal joint values.

Returns:

A 6-dimension vector that contains the maximal joint values for the 6 dof. All the values are expressed in radians.

getJointMin(self) visp._visp.core.ColVector

Get minimal joint values.

Returns:

A 6-dimension vector that contains the minimal joint values for the 6 dof. All the values are expressed in radians.

getMaxRotationVelocity(self) float

Get the maximal rotation velocity that can be sent to the robot during a velocity control.

Returns:

Maximum rotation velocity expressed in rad/s.

getMaxRotationVelocityJoint6(self) float

Get the maximal rotation velocity on joint 6 that is used only during velocity joint control.

Returns:

Maximum rotation velocity on joint 6 expressed in rad/s.

getMaxTranslationVelocity(self) float

Get the maximal translation velocity that can be sent to the robot during a velocity control.

Returns:

Maximum translational velocity expressed in m/s.

getNDof(self) int

Return robot degrees of freedom number.

getPosition(*args, **kwargs)

Overloaded function.

  1. getPosition(self: visp._visp.robot.RobotViper850, frame: visp._visp.robot.Robot.ControlFrameType, position: visp._visp.core.ColVector) -> None

  2. getPosition(self: visp._visp.robot.RobotViper850, frame: visp._visp.robot.Robot.ControlFrameType, position: visp._visp.core.ColVector, timestamp: float) -> float

  3. getPosition(self: visp._visp.robot.RobotViper850, frame: visp._visp.robot.Robot.ControlFrameType, position: visp._visp.core.PoseVector) -> None

  4. getPosition(self: visp._visp.robot.RobotViper850, frame: visp._visp.robot.Robot.ControlFrameType, position: visp._visp.core.PoseVector, timestamp: float) -> float

  5. getPosition(self: visp._visp.robot.Robot, frame: visp._visp.robot.Robot.ControlFrameType) -> visp._visp.core.ColVector
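
A minimal Python sketch of the first and third overloads; the visp.core / visp.robot module layout is an assumption of this example:

from visp.core import ColVector, PoseVector
from visp.robot import RobotViper850

robot = RobotViper850()

q = ColVector(6)
robot.getPosition(RobotViper850.ARTICULAR_FRAME, q)     # joint positions, in rad

fPc = PoseVector()
robot.getPosition(RobotViper850.REFERENCE_FRAME, fPc)   # Cartesian pose of the tool frame w.r.t. the reference frame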

getPositioningVelocity(self) float

Get the maximal velocity percentage used for a position control.

Note

See setPositioningVelocity()

getPowerState(self) bool

Get the robot power state indication if power is on or off.

Note

See powerOn() , powerOff()

Returns:

true if power is on. false if power is off.

getRobotState(self) visp._visp.robot.Robot.RobotStateType
getTime(self) float

Returns the robot controller current time (in seconds) since the last robot power on.

getToolType(self) visp._visp.robot.Viper850.ToolType

Get the current tool type.

getVelocity(*args, **kwargs)

Overloaded function.

  1. getVelocity(self: visp._visp.robot.RobotViper850, frame: visp._visp.robot.Robot.ControlFrameType, velocity: visp._visp.core.ColVector) -> None

  2. getVelocity(self: visp._visp.robot.RobotViper850, frame: visp._visp.robot.Robot.ControlFrameType, velocity: visp._visp.core.ColVector, timestamp: float) -> float

  3. getVelocity(self: visp._visp.robot.RobotViper850, frame: visp._visp.robot.Robot.ControlFrameType) -> visp._visp.core.ColVector

  4. getVelocity(self: visp._visp.robot.RobotViper850, frame: visp._visp.robot.Robot.ControlFrameType, timestamp: float) -> tuple[visp._visp.core.ColVector, float]

get_cMe(self, cMe: visp._visp.core.HomogeneousMatrix) None

Get the geometric transformation \(^c{\bf M}_e\) between the camera frame and the end-effector frame. This transformation is constant and corresponds to the extrinsic camera parameters estimated by calibration.

Parameters:
cMe: visp._visp.core.HomogeneousMatrix

Transformation between the camera frame and the end-effector frame.

get_cVe(self, cVe: visp._visp.core.VelocityTwistMatrix) None

Get the twist transformation \(^c{\bf V}_e\) from camera frame to end-effector frame. This transformation allows transforming a velocity expressed in the end-effector frame into the camera frame.

Parameters:
cVe: visp._visp.core.VelocityTwistMatrix

Twist transformation.

get_eJe(self, eJe: visp._visp.core.Matrix) None

Get the robot jacobian expressed in the end-effector frame.

To compute \(^e{\bf J}_e\) , we communicate with the low level controller to get the joint position of the robot.

Parameters:
eJe: visp._visp.core.Matrix

Robot jacobian \(^e{\bf J}_e\) expressed in the end-effector frame.
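
Both transformations are typically combined to express the robot Jacobian in the camera frame. A minimal Python sketch (the visp.core / visp.robot module layout and the availability of the * operator on VelocityTwistMatrix are assumptions of this example):

from visp.core import Matrix, VelocityTwistMatrix
from visp.robot import RobotViper850

robot = RobotViper850()

cVe = VelocityTwistMatrix()
robot.get_cVe(cVe)    # constant twist transformation from end-effector frame to camera frame

eJe = Matrix()
robot.get_eJe(eJe)    # robot Jacobian expressed in the end-effector frame

cJe = cVe * eJe       # robot Jacobian expressed in the camera frame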

get_eMc(self, eMc: visp._visp.core.HomogeneousMatrix) None

Get the geometric transformation between the end-effector frame and the camera frame. This transformation is constant and corresponds to the extrinsic camera parameters estimated by calibration.

Note

See get_cMe()

get_eMs(self, eMs: visp._visp.core.HomogeneousMatrix) None

Get the geometric transformation between the end-effector frame and the force/torque sensor frame. This transformation is constant.

Parameters:
eMs: visp._visp.core.HomogeneousMatrix

Transformation between the end-effector frame and the force/torque sensor frame.

get_fJe(self, fJe: visp._visp.core.Matrix) None

Get the robot jacobian expressed in the robot reference frame also called fix frame.

To compute \(^f{\bf J}_e\) , we communicate with the low level controller to get the joint position of the robot.

Parameters:
fJe: visp._visp.core.Matrix

Robot jacobian \(^f{\bf J}_e\) expressed in the reference frame.

get_fJw(self, q: visp._visp.core.ColVector, fJw: visp._visp.core.Matrix) None

Get the robot jacobian \({^f}{\bf J}_w\) which expresses the velocity of the origin of the wrist frame in the robot reference frame also called fix frame.

\[\begin{split}{^f}J_w = \left(\begin{array}{cccccc} J_{11} & J_{12} & J_{13} & 0 & 0 & 0 \\ J_{21} & J_{22} & J_{23} & 0 & 0 & 0 \\ 0 & J_{32} & J_{33} & 0 & 0 & 0 \\ 0 & -s1 & -s1 & c1s23 & J_{45} & J_{46} \\ 0 & c1 & c1 & s1s23 & J_{55} & J_{56} \\ 1 & 0 & 0 & c23 & s23s4 & J_{66} \\ \end{array} \right) \end{split}\]

with

\[\begin{split}\begin{array}{l} J_{11} = -s1(-c23a3+s23d4+a1+a2c2) \\J_{21} = c1(-c23a3+s23d4+a1+a2c2) \\J_{12} = c1(s23a3+c23d4-a2s2) \\J_{22} = s1(s23a3+c23d4-a2s2) \\J_{32} = c23a3-s23d4-a2c2 \\J_{13} = c1(a3(s2c3+c2s3)+(-s2s3+c2c3)d4)\\J_{23} = s1(a3(s2c3+c2s3)+(-s2s3+c2c3)d4)\\J_{33} = -a3(s2s3-c2c3)-d4(s2c3+c2s3)\\J_{45} = -c23c1s4-s1c4\\J_{55} = c1c4-c23s1s4\\J_{46} = (c1c23c4-s1s4)s5+c1s23c5\\J_{56} = (s1c23c4+c1s4)s5+s1s23c5\\J_{66} = -s23c4s5+c23c5\\\end{array} \end{split}\]

Note

See get_fJe() , get_eJe()

Parameters:
q: visp._visp.core.ColVector

A six-dimension vector that contains the joint positions of the robot expressed in radians.

fJw: visp._visp.core.Matrix

Robot jacobian \({^f}{\bf J}_w\) that expresses the velocity of the point w (origin of the wrist frame) in the robot reference frame.

get_fMc(*args, **kwargs)

Overloaded function.

  1. get_fMc(self: visp._visp.robot.Viper, q: visp._visp.core.ColVector) -> visp._visp.core.HomogeneousMatrix

Compute the forward kinematics (direct geometric model) as an homogeneous matrix.

By forward kinematics we mean here the position and the orientation of the camera relative to the base frame given the joint positions of all the six joints.

\[^f{\bf M}_c = ^f{\bf M}_e \; ^e{\bf M}_c \]

This method is the same as getForwardKinematics(const vpColVector & q).

Note

See getForwardKinematics(const vpColVector & q), get_fMe() , get_eMc()

Parameters:
q

Vector of six joint positions expressed in radians.

Returns:

The homogeneous matrix corresponding to the direct geometric model which expresses the transformation between the base frame and the camera frame (fMc).

  2. get_fMc(self: visp._visp.robot.Viper, q: visp._visp.core.ColVector, fMc: visp._visp.core.HomogeneousMatrix) -> None

Compute the forward kinematics (direct geometric model) as an homogeneous matrix.

By forward kinematics we mean here the position and the orientation of the camera relative to the base frame given the six joint positions.

\[^f{\bf M}_c = ^f{\bf M}_e \; {^e}{\bf M}_c \]

Note

See get_fMe() , get_eMc()

Parameters:
q

Vector of six joint positions expressed in radians.

fMc

The homogeneous matrix \(^f{\bf M}_c\) corresponding to the direct geometric model which expresses the transformation between the fix frame and the camera frame.

get_fMe(self, q: visp._visp.core.ColVector, fMe: visp._visp.core.HomogeneousMatrix) None

Compute the forward kinematics (direct geometric model) as an homogeneous matrix \({^f}{\bf M}_e\) .

By forward kinematics we mean here the position and the orientation of the end effector with respect to the base frame given the motor positions of all the six joints.

\[\begin{split}{^f}M_e = \left(\begin{array}{cccc} r_{11} & r_{12} & r_{13} & t_x \\r_{21} & r_{22} & r_{23} & t_y \\r_{31} & r_{32} & r_{33} & t_z \\\end{array} \right) \end{split}\]

with

\[\begin{split}\begin{array}{l} r_{11} = c1(c23(c4c5c6-s4s6)-s23s5c6)-s1(s4c5c6+c4s6) \\r_{21} = -s1(c23(-c4c5c6+s4s6)+s23s5c6)+c1(s4c5c6+c4s6) \\r_{31} = s23(s4s6-c4c5c6)-c23s5c6 \\\\r_{12} = -c1(c23(c4c5s6+s4c6)-s23s5s6)+s1(s4c5s6-c4c6)\\r_{22} = -s1(c23(c4c5s6+s4c6)-s23s5s6)-c1(s4c5s6-c4c6)\\r_{32} = s23(c4c5s6+s4c6)+c23s5s6\\\\r_{13} = c1(c23c4s5+s23c5)-s1s4s5\\r_{23} = s1(c23c4s5+s23c5)+c1s4s5\\r_{33} = -s23c4s5+c23c5\\\\t_x = c1(c23(c4s5d6-a3)+s23(c5d6+d4)+a1+a2c2)-s1s4s5d6\\t_y = s1(c23(c4s5d6-a3)+s23(c5d6+d4)+a1+a2c2)+c1s4s5d6\\t_z = s23(a3-c4s5d6)+c23(c5d6+d4)-a2s2+d1\\\end{array} \end{split}\]

Note that this transformation can also be computed by considering the wrist frame \({^f}{\bf M}_e = {^f}{\bf M}_w *{^w}{\bf M}_e\) .

#include <visp3/robot/vpViper.h>

#ifdef ENABLE_VISP_NAMESPACE
using namespace VISP_NAMESPACE_NAME;
#endif

int main()
{
  vpViper robot;
  vpColVector q(6); // The measured six joint positions

  vpHomogeneousMatrix fMe; // Transformation from fix frame to end-effector
  robot.get_fMe(q, fMe); // Get the forward kinematics

  // The forward kinematics can also be computed by considering the wrist frame
  vpHomogeneousMatrix fMw; // Transformation from fix frame to wrist frame
  robot.get_fMw(q, fMw);
  vpHomogeneousMatrix wMe; // Transformation from wrist frame to end-effector
  robot.get_wMe(wMe); // Constant transformation

  // Compute the forward kinematics
  fMe = fMw * wMe;
}
Parameters:
q: visp._visp.core.ColVector

A 6-dimension vector that contains the 6 joint positions expressed in radians.

fMe: visp._visp.core.HomogeneousMatrix

The homogeneous matrix \({^f}{\bf M}_e\) corresponding to the direct geometric model which expresses the transformation between the fix frame and the end effector frame.

get_fMw(self, q: visp._visp.core.ColVector, fMw: visp._visp.core.HomogeneousMatrix) None

Compute the transformation between the fix frame and the wrist frame. The wrist frame is located at the intersection of the last three rotations.

\[\begin{split}{^f}M_w = \left(\begin{array}{cccc} r_{11} & r_{12} & r_{13} & t_x \\r_{21} & r_{22} & r_{23} & t_y \\r_{31} & r_{32} & r_{33} & t_z \\\end{array} \right) \end{split}\]

with

\[\begin{split}\begin{array}{l} r_{11} = c1(c23(c4c5c6-s4s6)-s23s5c6)-s1(s4c5c6+c4s6) \\r_{21} = -s1(c23(-c4c5c6+s4s6)+s23s5c6)+c1(s4c5c6+c4s6) \\r_{31} = s23(s4s6-c4c5c6)-c23s5c6 \\\\r_{12} = -c1(c23(c4c5s6+s4c6)-s23s5s6)+s1(s4c5s6-c4c6)\\r_{22} = -s1(c23(c4c5s6+s4c6)-s23s5s6)-c1(s4c5s6-c4c6)\\r_{32} = s23(c4c5s6+s4c6)+c23s5s6\\\\r_{13} = c1(c23c4s5+s23c5)-s1s4s5\\r_{23} = s1(c23c4s5+s23c5)+c1s4s5\\r_{33} = -s23c4s5+c23c5\\\\t_x = c1(-c23a3+s23d4+a1+a2c2)\\t_y = s1(-c23a3+s23d4+a1+a2c2)\\t_z = s23a3+c23d4-a2s2+d1\\\end{array} \end{split}\]
Parameters:
q: visp._visp.core.ColVector

A 6-dimension vector that contains the 6 joint positions expressed in radians.

fMw: visp._visp.core.HomogeneousMatrix

The homogeneous matrix corresponding to the transformation between the fix frame and the wrist frame (fMw).

get_wMe(self, wMe: visp._visp.core.HomogeneousMatrix) None

Return the transformation between the wrist frame and the end-effector. The wrist frame is located at the intersection of the last three rotations.

Parameters:
wMe: visp._visp.core.HomogeneousMatrix

The homogeneous matrix corresponding to the transformation between the wrist frame and the end-effector frame (wMe).

init(*args, **kwargs)

Overloaded function.

  1. init(self: visp._visp.robot.RobotViper850) -> None

Initialise the connection with the MotionBox or low level controller, send the default eMc homogeneous matrix, power on the robot and wait 1 sec before returning to be sure the initialisation is done.

Warning

This method sets the camera extrinsic parameters (matrix eMc) to the one obtained by calibration with a camera projection model without distortion by calling init(vpViper850::defaultCameraRobot). If you want to set the extrinsic camera parameters to those obtained with a camera perspective model including the distortion you have to call the init(vpViper850::vpViper850CameraRobotType, vpCameraParameters::vpCameraParametersProjType ) method. If you want to set custom extrinsic camera parameters you have to call the init(vpViper850::vpToolType, const vpHomogeneousMatrix&) method.

Note

See vpCameraParameters , init ( vpViper850::vpToolType , vpCameraParameters::vpCameraParametersProjType ), init(vpViper850::vpToolType, const vpHomogeneousMatrix&) , init(vpViper850::vpToolType, const std::string&)

  2. init(self: visp._visp.robot.RobotViper850, tool: visp._visp.robot.Viper850.ToolType, projModel: visp._visp.core.CameraParameters.CameraParametersProjType) -> None

Initialize the robot kinematics with the extrinsic calibration parameters associated to a specific camera (set the eMc homogeneous parameters in the low level controller) and also get the joint limits from the low-level controller.

The eMc parameters depend on the camera and the projection model in use.

To set the extrinsic camera parameters related to the \(^e{\bf M}_c\) matrix obtained with a camera perspective projection model including the distortion, use the code below:

#include <visp3/core/vpCameraParameters.h>
#include <visp3/core/vpImage.h>
#include <visp3/robot/vpRobotViper850.h>
#include <visp3/sensor/vp1394TwoGrabber.h>

#ifdef ENABLE_VISP_NAMESPACE
using namespace VISP_NAMESPACE_NAME;
#endif

int main()
{
#ifdef VISP_HAVE_VIPER850
  vpRobotViper850 robot;

  // Set the extrinsic camera parameters obtained with a perspective
  // projection model including a distortion parameter
  robot.init(vpViper850::TOOL_MARLIN_F033C_CAMERA, vpCameraParameters::perspectiveProjWithDistortion);

Now, you can get the intrinsic camera parameters associated to an image acquired by the camera attached to the robot, with:

  vpImage<unsigned char> I(480, 640);

  // Get an image from the camera attached to the robot
#ifdef VISP_HAVE_DC1394
  vp1394TwoGrabber g;
  g.acquire(I);
#endif
  vpCameraParameters cam;
  robot.getCameraParameters(cam, I);
  // In cam, you get the intrinsic parameters of the projection model
  // with distortion.
#endif
}

Note

See vpCameraParameters , init(vpViper850::vpToolType, const vpHomogeneousMatrix&) , init(vpViper850::vpToolType, const std::string&)

Parameters:
tool

Tool to use.

projModel

Projection model associated to the camera.

  3. init(self: visp._visp.robot.RobotViper850, tool: visp._visp.robot.Viper850.ToolType, filename: str) -> None

Initialize the robot kinematics (set the eMc homogeneous parameters in the low level controller) from a file and also get the joint limits from the low-level controller.

To set the transformation parameters related to the \(^e{\bf M}_c\) matrix, use the code below:

#include <visp3/robot/vpRobotViper850.h>

#ifdef ENABLE_VISP_NAMESPACE
using namespace VISP_NAMESPACE_NAME;
#endif

int main()
{
#ifdef VISP_HAVE_VIPER850
  vpRobotViper850 robot;

  // Set the transformation between the end-effector frame
  // and the tool frame from a file
  std::string filename("./EffectorToolTransformation.cnf");

  robot.init(vpViper850::TOOL_CUSTOM, filename);
#endif
}

The configuration file should have the form below:

# Start with any number of consecutive lines
# beginning with the symbol '#'
#
# The 3 following lines contain the name of the camera,
# the rotation parameters of the geometric transformation
# using the Euler angles in degrees with convention XYZ and
# the translation parameters expressed in meters
CAMERA CameraName
eMc_ROT_XYZ 10.0 -90.0 20.0
eMc_TRANS_XYZ  0.05 0.01 0.06

Note

See init() , init ( vpViper850::vpToolType , vpCameraParameters::vpCameraParametersProjType ), init(vpViper850::vpToolType, const vpHomogeneousMatrix&)

Parameters:
tool

Tool to use.

filename

Path of the configuration file containing the transformation between the end-effector frame and the tool frame.

  4. init(self: visp._visp.robot.RobotViper850, tool: visp._visp.robot.Viper850.ToolType, eMc_: visp._visp.core.HomogeneousMatrix) -> None

Initialize the robot kinematics with user defined parameters (set the eMc homogeneous parameters in the low level controller) and also get the joint limits from the low-level controller.

To set the transformation parameters related to the \(^e{\bf M}_c\) matrix, use the code below:

#include <visp3/core/vpHomogeneousMatrix.h>
#include <visp3/robot/vpRobotViper850.h>

#ifdef ENABLE_VISP_NAMESPACE
using namespace VISP_NAMESPACE_NAME;
#endif

int main()
{
#ifdef VISP_HAVE_VIPER850
  vpRobotViper850 robot;

  // Set the transformation between the end-effector frame
  // and the tool frame.
  vpHomogeneousMatrix eMc(0.001, 0.0, 0.1, 0.0, 0.0, M_PI/2);

  robot.init(vpViper850::TOOL_CUSTOM, eMc);
#endif
}

Note

See vpCameraParameters , init() , init ( vpViper850::vpToolType , vpCameraParameters::vpCameraParametersProjType ), init(vpViper850::vpToolType, const std::string&)

Parameters:
tool

Tool to use.

eMc_

Transformation between the end-effector frame and the tool frame.

  5. init(self: visp._visp.robot.Viper850) -> None

Initialize the robot with the default tool vpViper850::defaultTool .

  6. init(self: visp._visp.robot.Viper850, camera_extrinsic_parameters: str) -> None

Read files containing the constant parameters related to the robot tools in order to set the end-effector to tool transformation.

Parameters:
camera_extrinsic_parameters

Filename containing the camera extrinsic parameters.

  7. init(self: visp._visp.robot.Viper850, tool: visp._visp.robot.Viper850.ToolType, projModel: visp._visp.core.CameraParameters.CameraParametersProjType) -> None

Set the constant parameters related to the robot kinematics and to the end-effector to camera transformation ( \(^e{\bf M}_c\) ) corresponding to the camera extrinsic parameters. These last parameters depend on the camera and projection model in use and are loaded from predefined files or parameters.

Warning

If the macro VISP_HAVE_VIPER850_DATA is defined in vpConfig.h this function reads the camera extrinsic parameters from the file corresponding to the specified camera type and projection type. Otherwise corresponding default parameters are loaded.

Note

See init(vpViper850::vpToolType, const std::string&) , init(vpViper850::vpToolType, const vpHomogeneousMatrix&)

Parameters:
tool

Camera in use.

  8. init(self: visp._visp.robot.Viper850, tool: visp._visp.robot.Viper850.ToolType, filename: str) -> None

Set the type of tool attached to the robot and the transformation between the end-effector and the tool ( \(^e{\bf M}_c\) ). This last parameter is loaded from a file.

The configuration file should have the form below:

# Start with any number of consecutive lines
# beginning with the symbol '#'
#
# The 3 following lines contain the name of the camera,
# the rotation parameters of the geometric transformation
# using the Euler angles in degrees with convention XYZ and
# the translation parameters expressed in meters
CAMERA CameraName
eMc_ROT_XYZ 10.0 -90.0 20.0
eMc_TRANS_XYZ  0.05 0.01 0.06

Note

See init ( vpViper850::vpToolType , vpCameraParameters::vpCameraParametersProjType ), init ( vpViper850::vpToolType , const vpHomogeneousMatrix &)

Parameters:
tool

Type of tool in use.

filename

Path of the configuration file containing the transformation between the end-effector frame and the tool frame.

  9. init(self: visp._visp.robot.Viper850, tool: visp._visp.robot.Viper850.ToolType, eMc_: visp._visp.core.HomogeneousMatrix) -> None

Set the type of tool attached to the robot and the transformation between the end-effector and the tool ( \(^e{\bf M}_c\) ).

Note

See init ( vpViper850::vpToolType , vpCameraParameters::vpCameraParametersProjType ), init(vpViper850::vpToolType, const std::string&)

Parameters:
tool

Type of tool in use.

eMc_

Homogeneous matrix representation of the transformation between the end-effector frame and the tool frame.

move(self, filename: str) None

Moves the robot to the joint position specified in the filename. The positioning velocity is set to 10% of the robot maximal velocity.

Note

See readPosFile

Parameters:
filename: str

File containing a joint position.

openGripper(self) None

Open the pneumatic two fingers gripper.

Note

See closeGripper()

parseConfigFile(self, filename: str) None

This function gets the robot constant parameters from a file.

Parameters:
filename: str

File name containing the robot constant parameters, like the hand-to-eye transformation.

powerOff(self) None

Power off the robot.

Note

See powerOn() , getPowerState()

powerOn(self) None

Power on the robot.

Note

See powerOff() , getPowerState()

static readPosFile(filename: str, q: visp._visp.core.ColVector) bool

Read joint positions in a specific Viper850 position file.

This position file has to start with a header. The six joint positions are given after the “R:” keyword and are expressed in degrees to be more representative for the user. These values are then converted into radians in q . The character “#” starting a line indicates a comment.

A typical content of such a file is given below:

#Viper - Position - Version 1.0
# file: "myposition.pos "
#
# R: A B C D E F
# Joint position in degrees
#

R: 0.1 0.3 -0.25 -80.5 80 0

The code below shows how to read a position from a file and move the robot to this position.

#include <visp3/core/vpColVector.h>
#include <visp3/robot/vpRobotViper850.h>

#ifdef ENABLE_VISP_NAMESPACE
using namespace VISP_NAMESPACE_NAME;
#endif

int main()
{
#ifdef VISP_HAVE_VIPER850
  vpRobotViper850 robot;

  // Enable the position control of the robot
  robot.setRobotState(vpRobot::STATE_POSITION_CONTROL);

  // Get the current robot joint positions
  vpColVector q;        // Current joint position
  robot.getPosition(vpRobot::ARTICULAR_FRAME, q);

  // Save this position in a file named "current.pos"
  robot.savePosFile("current.pos", q);

  // Get the position from a file and move to the registered position
  robot.readPosFile("current.pos", q); // Set the joint position from the file

  robot.setPositioningVelocity(5); // Positioning velocity set to 5%
  robot.setPosition(vpRobot::ARTICULAR_FRAME, q); // Move to the joint position
#endif
}

Note

See savePosFile()

Parameters:
filename: str

Name of the position file to read.

q: visp._visp.core.ColVector

The six joint positions. Values are expressed in radians.

Returns:

true if the positions were successfully read from the file. false if an error occurs.

static saturateVelocities(v_in: visp._visp.core.ColVector, v_max: visp._visp.core.ColVector, verbose: bool = false) visp._visp.core.ColVector

Saturate velocities.

The code below shows how to use this static method in order to saturate a velocity skew vector.

#include <iostream>

#include <visp3/core/vpColVector.h>
#include <visp3/core/vpMath.h>
#include <visp3/robot/vpRobot.h>

#ifdef ENABLE_VISP_NAMESPACE
using namespace VISP_NAMESPACE_NAME;
#endif

int main()
{
  // Set a velocity skew vector
  vpColVector v(6);
  v[0] = 0.1;               // vx in m/s
  v[1] = 0.2;               // vy
  v[2] = 0.3;               // vz
  v[3] = vpMath::rad(10);   // wx in rad/s
  v[4] = vpMath::rad(-10);  // wy
  v[5] = vpMath::rad(20);   // wz

  // Set the maximal allowed velocities
  vpColVector v_max(6);
  for (int i=0; i<3; i++)
    v_max[i] = 0.3;             // in translation (m/s)
  for (int i=3; i<6; i++)
    v_max[i] = vpMath::rad(10); // in rotation (rad/s)

  // Compute the saturated velocity skew vector
  vpColVector v_sat = vpRobot::saturateVelocities(v, v_max, true);

  std::cout << "v    : " << v.t() << std::endl;
  std::cout << "v max: " << v_max.t() << std::endl;
  std::cout << "v sat: " << v_sat.t() << std::endl;

  return 0;
}
Parameters:
v_in: visp._visp.core.ColVector

Vector of input velocities to saturate. Translation velocities should be expressed in m/s while rotation velocities in rad/s.

v_max: visp._visp.core.ColVector

Vector of maximal allowed velocities. Maximal translation velocities should be expressed in m/s while maximal rotation velocities in rad/s.

verbose: bool = false

Print a message indicating which axis causes the saturation.

Returns:

Saturated velocities.

static savePosFile(filename: str, q: visp._visp.core.ColVector) bool

Save joint (articular) positions in a specific Viper850 position file.

This position file starts with a header on the first line. After conversion of the rotations to degrees, the joint position q is written on a line starting with the keyword “R: “. See the readPosFile() documentation for an example of such a file.

Note

See readPosFile()

Parameters:
filename: str

Name of the position file to create.

q: visp._visp.core.ColVector

The six joint positions to save in the filename. Values are expressed in radians.

Returns:

true if the positions were successfully saved in the file. false, if an error occurs.

setMaxRotationVelocity(*args, **kwargs)

Overloaded function.

  1. setMaxRotationVelocity(self: visp._visp.robot.RobotViper850, w_max: float) -> None

Set the maximal rotation velocity that can be sent to the robot during a velocity control.

Parameters:
w_max

Maximum rotation velocity expressed in rad/s.

  2. setMaxRotationVelocity(self: visp._visp.robot.Robot, maxVr: float) -> None

Set the maximal rotation velocity that can be sent to the robot during a velocity control.

setMaxRotationVelocityJoint6(self, w6_max: float) None

Set the maximal rotation velocity on joint 6 that is used only during velocity joint control.

This function affects only the velocities that are sent as joint velocities.

vpRobotViper850 robot;
robot.setMaxRotationVelocity( vpMath::rad(20) );
robot.setMaxRotationVelocityJoint6( vpMath::rad(50) );

robot.setRobotState(vpRobot::STATE_VELOCITY_CONTROL);

vpColVector v(6); // Joint velocities to apply, in rad/s
robot.setVelocity(vpRobot::ARTICULAR_FRAME, v);
Parameters:
w6_max: float

Maximum rotation velocity expressed in rad/s on joint 6.

setMaxTranslationVelocity(self, maxVt: float) None

Set the maximal translation velocity that can be sent to the robot during a velocity control.

setPosition(*args, **kwargs)

Overloaded function.

  1. setPosition(self: visp._visp.robot.RobotViper850, frame: visp._visp.robot.Robot.ControlFrameType, position: visp._visp.core.ColVector) -> None

  2. setPosition(self: visp._visp.robot.RobotViper850, frame: visp._visp.robot.Robot.ControlFrameType, pos1: float, pos2: float, pos3: float, pos4: float, pos5: float, pos6: float) -> None

  3. setPosition(self: visp._visp.robot.RobotViper850, filename: str) -> None

Move to an absolute joint position with a given percent of max velocity. The robot state is set to position control. The percent of max velocity is set with setPositioningVelocity() . The position to reach is defined in the position file.

This method has the same behavior as the sample code given below:

#include <visp3/core/vpColVector.h>
#include <visp3/robot/vpRobotViper850.h>

#ifdef ENABLE_VISP_NAMESPACE
using namespace VISP_NAMESPACE_NAME;
#endif

int main()
{
#ifdef VISP_HAVE_VIPER850
  vpColVector q;
  vpRobotViper850 robot;

  robot.readPosFile("MyPositionFilename.pos", q);
  robot.setRobotState(vpRobot::STATE_POSITION_CONTROL);
  robot.setPosition(vpRobot::ARTICULAR_FRAME, q);
#endif
}

Note

See setPositioningVelocity()

Parameters:
filename

Name of the position file to read. The readPosFile() documentation shows a typical content of such a position file.

setPositioningVelocity(self, velocity: float) None

Set the maximal velocity percentage to use for a position control.

The default positioning velocity is defined by vpRobotViper850::m_defaultPositioningVelocity . This method allows changing this default positioning velocity:

#include <visp3/core/vpColVector.h>
#include <visp3/robot/vpRobotViper850.h>

#ifdef ENABLE_VISP_NAMESPACE
using namespace VISP_NAMESPACE_NAME;
#endif

int main()
{
#ifdef VISP_HAVE_VIPER850
  vpColVector position(6);
  position = 0; // position in rad

  vpRobotViper850 robot;

  robot.setRobotState(vpRobot::STATE_POSITION_CONTROL);

  // Set the max velocity to 20%
  robot.setPositioningVelocity(20);

  // Moves the robot to the joint position [0,0,0,0,0,0]
  robot.setPosition(vpRobot::ARTICULAR_FRAME, position);
#endif
}

Note

See getPositioningVelocity()

Parameters:
velocity: float

Percentage of the maximal velocity. Values should be in ]0:100].

setRobotState(*args, **kwargs)

Overloaded function.

  1. setRobotState(self: visp._visp.robot.RobotViper850, newState: visp._visp.robot.Robot.RobotStateType) -> visp._visp.robot.Robot.RobotStateType

Change the robot state.

Parameters:
newState

New requested robot state.

  2. setRobotState(self: visp._visp.robot.Robot, newState: visp._visp.robot.Robot.RobotStateType) -> visp._visp.robot.Robot.RobotStateType

setVelocity(self, frame: visp._visp.robot.Robot.ControlFrameType, velocity: visp._visp.core.ColVector) None
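
A minimal Python sketch of velocity control in the camera frame; the visp.core / visp.robot module layout and ColVector indexing are assumptions of this example. Don’t forget to stop the robot when done.

# Sketch: send a Cartesian velocity in the camera frame for about 1 s, then stop
import time
from visp.core import ColVector
from visp.robot import RobotViper850

robot = RobotViper850()
robot.setRobotState(RobotViper850.STATE_VELOCITY_CONTROL)

v = ColVector(6)   # [v_x, v_y, v_z, w_x, w_y, w_z] in m/s and rad/s
v[2] = 0.02        # 2 cm/s along the camera z axis

t0 = time.time()
while time.time() - t0 < 1.0:
    robot.setVelocity(RobotViper850.CAMERA_FRAME, v)

robot.setRobotState(RobotViper850.STATE_STOP)   # stop the robot
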
setVerbose(self, verbose: bool) None
set_eMc(*args, **kwargs)

Overloaded function.

  1. set_eMc(self: visp._visp.robot.RobotViper850, eMc_: visp._visp.core.HomogeneousMatrix) -> None

Set the geometric transformation between the end-effector frame and the tool frame in the low level controller.

Warning

This function overwrites the transformation parameters that were potentially set using one of the init functions.

Parameters:
eMc_

Transformation between the end-effector frame and the tool frame.

  2. set_eMc(self: visp._visp.robot.RobotViper850, etc_: visp._visp.core.TranslationVector, erc_: visp._visp.core.RxyzVector) -> None

Set the geometric transformation between the end-effector frame and the tool frame in the low level controller.

Warning

This function overwrites the transformation parameters that were potentially set using one of the init functions.

Parameters:
etc_

Translation between the end-effector frame and the tool frame.

erc_

Rotation between the end-effector frame and the tool frame using the Euler angles in radians with the XYZ convention.
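
A minimal Python sketch of this second overload; the visp.core / visp.robot module layout and the constructor signatures used here are assumptions of this example.

import math
from visp.core import RxyzVector, TranslationVector
from visp.robot import RobotViper850

robot = RobotViper850()

etc = TranslationVector(0.001, 0.0, 0.1)   # translation, in meters
erc = RxyzVector(0.0, 0.0, math.pi / 2)    # Euler Rxyz rotation, in radians
robot.set_eMc(etc, erc)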

stopMotion(self) None

Stop the robot and set the robot state to vpRobot::STATE_STOP .

unbiasForceTorqueSensor(self) None

Unbias the force/torque sensor.

Note

See biasForceTorqueSensor() , getForceTorque()