RobotWireFrameSimulator¶
- class RobotWireFrameSimulator¶
Bases:
RobotSimulator
This class is intended as a base class for all robot simulators. It therefore gathers the parameters and methods needed to build a simulator. Several methods are pure virtual, which means they are specific to each robot, for example the computation of the geometric model.
Warning
This class uses threading capabilities. Thus, on Unix-like platforms, the libpthread third-party library needs to be installed. On Windows, the native threading capabilities are used.
Methods
Get the parameters of the virtual external camera.
Get the external camera's position relative to the world reference frame.
Overloaded function.
Get the pose between the object and the robot's camera.
Get the pose between the object and the fixed world frame.
Overloaded function.
Set the color used to display the camera in the external view.
Set the flag used to force the sampling time in the thread computing the robot's displacement to a constant value; see setSamplingTime() .
Set the color used to display the object at the current position in the robot's camera view.
Set the desired position of the robot's camera relative to the object.
Set the color used to display the object at the desired position in the robot's camera view.
Set the external camera point of view.
Specify the thickness of the graphics drawings.
Overloaded function.
Set the parameter which enables or disables the singularity management.
Activate extra printing when the robot reaches joint limits.
Set the pose between the object and the fixed world frame.
Inherited Methods
STATE_FORCE_TORQUE_CONTROL
TOOL_FRAME
Set the maximal translation velocity that can be sent to the robot during a velocity control.
STATE_VELOCITY_CONTROL
Saturate velocities.
REFERENCE_FRAME
Get the maximal translation velocity that can be sent to the robot during a velocity control.
Robot control frames.
Robot states.
Return robot degrees of freedom number.
Get the maximal rotation velocity that can be sent to the robot during a velocity control.
STATE_STOP
STATE_POSITION_CONTROL
STATE_ACCELERATION_CONTROL
CAMERA_FRAME
Set the maximal rotation velocity that can be sent to the robot during a velocity control.
END_EFFECTOR_FRAME
ARTICULAR_FRAME
MIXT_FRAME
Return the sampling time.
JOINT_STATE
Operators
__doc__
__module__
Attributes
ARTICULAR_FRAME
CAMERA_FRAME
END_EFFECTOR_FRAME
I
JOINT_STATE
MIXT_FRAME
MODEL_3D
MODEL_DH
REFERENCE_FRAME
STATE_ACCELERATION_CONTROL
STATE_FORCE_TORQUE_CONTROL
STATE_POSITION_CONTROL
STATE_STOP
STATE_VELOCITY_CONTROL
TOOL_FRAME
__annotations__
- class ControlFrameType(self, value: int)¶
Bases:
pybind11_object
Robot control frames.
Values:
REFERENCE_FRAME: Corresponds to a fixed reference frame attached to the robot structure.
ARTICULAR_FRAME: Corresponds to the joint state. This value is deprecated. You should rather use vpRobot::JOINT_STATE.
JOINT_STATE: Corresponds to the joint state.
END_EFFECTOR_FRAME: Corresponds to the robot end-effector frame.
CAMERA_FRAME: Corresponds to a frame attached to the camera mounted on the robot end-effector.
TOOL_FRAME: Corresponds to a frame attached to the tool (camera, gripper…) mounted on the robot end-effector. This value is equal to vpRobot::CAMERA_FRAME.
MIXT_FRAME: Corresponds to a “virtual” frame where translations are expressed in the reference frame, and rotations in the camera frame.
- class RobotStateType(self, value: int)¶
Bases:
pybind11_object
Robot states.
Values:
STATE_STOP: Stops robot motion, especially in velocity and acceleration control.
STATE_VELOCITY_CONTROL: Initialize the velocity controller.
STATE_POSITION_CONTROL: Initialize the position controller.
STATE_ACCELERATION_CONTROL: Initialize the acceleration controller.
STATE_FORCE_TORQUE_CONTROL: Initialize the force/torque controller.
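As a quick orientation, the sketch below shows how these enumerations are typically reached from Python. The visp.robot import path is an assumption; the signatures on this page only show the internal visp._visp.robot module name.

from visp.robot import Robot  # assumed public re-export of visp._visp.robot.Robot

# Control frame in which a position or velocity is expressed
frame = Robot.ControlFrameType.CAMERA_FRAME

# Controller state to request before sending commands
state = Robot.RobotStateType.STATE_VELOCITY_CONTROL

print(frame, state)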
- __init__(*args, **kwargs)¶
- getExternalCameraParameters(self) visp._visp.core.CameraParameters ¶
Get the parameters of the virtual external camera.
- Returns:
It returns the camera parameters.
- getExternalCameraPosition(self) visp._visp.core.HomogeneousMatrix ¶
Get the external camera’s position relative to the world reference frame.
- Returns:
The main external camera position relative to the world reference frame.
- getInternalView(*args, **kwargs)¶
Overloaded function.
getInternalView(self: visp._visp.robot.RobotWireFrameSimulator, I: visp._visp.core.ImageRGBa) -> None
Get the view of the robot’s camera.
Depending on the initialization method you used, the current position and possibly the desired position of the object are displayed.
Warning
The objects are displayed using overlays. The image I is not modified.
getInternalView(self: visp._visp.robot.RobotWireFrameSimulator, I: visp._visp.core.ImageGray) -> None
Get the view of the robot’s camera.
Depending on the initialization method you used, the current position and possibly the desired position of the object are displayed.
Warning
The objects are displayed using overlays. The image I is not modified.
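A minimal usage sketch, assuming the public visp.core module path and an (height, width) ImageRGBa constructor; robot stands for an instance of a concrete RobotWireFrameSimulator subclass.

from visp.core import ImageRGBa

def show_internal_view(robot):
    # robot: instance of a concrete RobotWireFrameSimulator subclass
    # (this class itself is a base class with pure virtual methods).
    I = ImageRGBa(480, 640)   # assumed (height, width) constructor
    robot.getInternalView(I)  # scene drawn as overlays in the display attached to I; the pixels of I are not modified
    return I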
- getMaxRotationVelocity(self) float ¶
Get the maximal rotation velocity that can be sent to the robot during a velocity control.
- Returns:
Maximum rotation velocity expressed in rad/s.
- getMaxTranslationVelocity(self) float ¶
Get the maximal translation velocity that can be sent to the robot during a velocity control.
- Returns:
Maximum translational velocity expressed in m/s.
- getPosition(self, frame: visp._visp.robot.Robot.ControlFrameType) visp._visp.core.ColVector ¶
- getRobotState(self) visp._visp.robot.Robot.RobotStateType ¶
- getSamplingTime(self) float ¶
Return the sampling time.
- Returns:
Sampling time in seconds used to compute the robot displacement from the velocity applied to the robot during this time.
- get_cMo(self) visp._visp.core.HomogeneousMatrix ¶
Get the pose between the object and the robot’s camera.
- Returns:
The pose between the object and the robot’s camera.
- get_fMo(self) visp._visp.core.HomogeneousMatrix ¶
Get the pose between the object and the fixed world frame.
- Returns:
The pose between the object and the fixed world frame.
- initScene(*args, **kwargs)¶
Overloaded function.
initScene(self: visp._visp.robot.RobotWireFrameSimulator, obj: visp._visp.robot.WireFrameSimulator.SceneObject, desiredObject: visp._visp.robot.WireFrameSimulator.SceneDesiredObject) -> None
initScene(self: visp._visp.robot.RobotWireFrameSimulator, obj: str, desiredObject: str) -> None
Initialize the display. It allows you to choose the type of scene that will be used to display the object at the current position and at the desired position.
You can use the scene you want: set the path to the .bnd scene file. It is also possible to use a VRML (.wrl) file.
- Parameters:
- obj
Path to the scene file you want to use.
initScene(self: visp._visp.robot.RobotWireFrameSimulator, obj: visp._visp.robot.WireFrameSimulator.SceneObject) -> None
initScene(self: visp._visp.robot.RobotWireFrameSimulator, obj: str) -> None
Initialize the display. It allows you to choose the type of scene that will be used to display the object at the current position. The object at the desired position is not displayed.
You can use the scene you want: set the path to the .bnd scene file, or to a VRML (.wrl) file.
- Parameters:
- obj
Path to the scene file you want to use.
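A hedged sketch of the two overload families; the PLATE and D_STANDARD enum members (mirroring the C++ API) and the scene path are illustrative assumptions, not values taken from this page.

from visp.robot import WireFrameSimulator

def init_scene_examples(robot):
    # robot: instance of a concrete RobotWireFrameSimulator subclass.

    # Predefined scene with both the current and the desired object
    # (PLATE / D_STANDARD are assumed enum members).
    robot.initScene(WireFrameSimulator.SceneObject.PLATE,
                    WireFrameSimulator.SceneDesiredObject.D_STANDARD)

    # Custom scene loaded from a .bnd file; the path is purely hypothetical.
    robot.initScene("path/to/my_scene.bnd")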
- static saturateVelocities(v_in: visp._visp.core.ColVector, v_max: visp._visp.core.ColVector, verbose: bool = False) visp._visp.core.ColVector ¶
Saturate velocities.
The code below shows how to use this static method in order to saturate a velocity skew vector.
#include <iostream>
#include <visp3/robot/vpRobot.h>

#ifdef ENABLE_VISP_NAMESPACE
using namespace VISP_NAMESPACE_NAME;
#endif

int main()
{
  // Set a velocity skew vector
  vpColVector v(6);
  v[0] = 0.1;              // vx in m/s
  v[1] = 0.2;              // vy
  v[2] = 0.3;              // vz
  v[3] = vpMath::rad(10);  // wx in rad/s
  v[4] = vpMath::rad(-10); // wy
  v[5] = vpMath::rad(20);  // wz

  // Set the maximal allowed velocities
  vpColVector v_max(6);
  for (int i = 0; i < 3; i++)
    v_max[i] = 0.3;             // in translation (m/s)
  for (int i = 3; i < 6; i++)
    v_max[i] = vpMath::rad(10); // in rotation (rad/s)

  // Compute the saturated velocity skew vector
  vpColVector v_sat = vpRobot::saturateVelocities(v, v_max, true);

  std::cout << "v    : " << v.t() << std::endl;
  std::cout << "v max: " << v_max.t() << std::endl;
  std::cout << "v sat: " << v_sat.t() << std::endl;

  return 0;
}
- Parameters:
- v_in: visp._visp.core.ColVector¶
Vector of input velocities to saturate. Translation velocities should be expressed in m/s while rotation velocities in rad/s.
- v_max: visp._visp.core.ColVector¶
Vector of maximal allowed velocities. Maximal translation velocities should be expressed in m/s while maximal rotation velocities in rad/s.
- verbose: bool = False¶
Print a message indicating which axis causes the saturation.
- Returns:
Saturated velocities.
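A Python sketch mirroring the C++ example above; the visp.core / visp.robot import paths and the ColVector indexing are assumptions based on the signatures documented on this page.

import math
from visp.core import ColVector
from visp.robot import Robot

# Velocity skew vector: [vx, vy, vz, wx, wy, wz]
v = ColVector(6)
v[0] = 0.1                  # vx in m/s
v[1] = 0.2                  # vy
v[2] = 0.3                  # vz
v[3] = math.radians(10)     # wx in rad/s
v[4] = math.radians(-10)    # wy
v[5] = math.radians(20)     # wz

# Maximal allowed velocities
v_max = ColVector(6)
for i in range(3):
    v_max[i] = 0.3               # translation limit in m/s
for i in range(3, 6):
    v_max[i] = math.radians(10)  # rotation limit in rad/s

# Saturate; verbose=True reports which axis causes the saturation
v_sat = Robot.saturateVelocities(v, v_max, True)
print("v    :", v.t())
print("v max:", v_max.t())
print("v sat:", v_sat.t())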
- setCameraColor(self, col: visp._visp.core.Color) None ¶
Set the color used to display the camera in the external view.
- Parameters:
- col: visp._visp.core.Color¶
The desired color.
- setConstantSamplingTimeMode(self, _constantSamplingTimeMode: bool) None ¶
Set the flag used to force the sampling time in the thread computing the robot’s displacement to a constant value; see setSamplingTime() . It may be useful if the main thread (computing the features) is very time consuming. False by default.
- setCurrentViewColor(self, col: visp._visp.core.Color) None ¶
Set the color used to display the object at the current position in the robot’s camera view.
- Parameters:
- col: visp._visp.core.Color¶
The desired color.
- setDesiredCameraPosition(self, cdMo_: visp._visp.core.HomogeneousMatrix) None ¶
Set the desired position of the robot’s camera relative to the object.
- Parameters:
- cdMo_: visp._visp.core.HomogeneousMatrix¶
The desired pose of the camera.
- setDesiredViewColor(self, col: visp._visp.core.Color) None ¶
Set the color used to display the object at the desired position in the robot’s camera view.
- Parameters:
- col: visp._visp.core.Color¶
The desired color.
- setDisplayRobotType(self, dispType: visp._visp.robot.RobotWireFrameSimulator.DisplayRobotType) None ¶
- setExternalCameraPosition(self, camMf_: visp._visp.core.HomogeneousMatrix) None ¶
Set the external camera point of view.
- Parameters:
- camMf_: visp._visp.core.HomogeneousMatrix¶
The pose of the external camera relative to the world reference frame.
- setMaxRotationVelocity(self, maxVr: float) None ¶
Set the maximal rotation velocity that can be sent to the robot during a velocity control.
- setMaxTranslationVelocity(self, maxVt: float) None ¶
Set the maximal translation velocity that can be sent to the robot during a velocity control.
- setRobotState(self, newState: visp._visp.robot.Robot.RobotStateType) visp._visp.robot.Robot.RobotStateType ¶
- setSamplingTime(*args, **kwargs)¶
Overloaded function.
setSamplingTime(self: visp._visp.robot.RobotWireFrameSimulator, delta_t: float) -> None
Set the sampling time.
Since the wireframe simulator is threaded, the sampling time is set to vpTime::getMinTimeForUsleepCall() / 1000 seconds.
- Parameters:
- delta_t
Sampling time in seconds used to compute the robot displacement from the velocity applied to the robot during this time.
setSamplingTime(self: visp._visp.robot.RobotSimulator, delta_t: float) -> None
Set the sampling time.
- Parameters:
- delta_t
Sampling time in seconds used to compute the robot displacement from the velocity applied to the robot during this time.
- setSingularityManagement(self, sm: bool) None ¶
Set the parameter which enables or disables the singularity management.
- setVerbose(self, verbose: bool) None ¶
Activate extra printing when the robot reaches joint limits.
- set_fMo(self, fMo_: visp._visp.core.HomogeneousMatrix) None ¶
Set the pose between the object and the fixed world frame.
- Parameters:
- fMo_: visp._visp.core.HomogeneousMatrix¶
The pose between the object and the fixed world frame.
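To tie the pose-related setters together, here is a short configuration sketch. The six-argument HomogeneousMatrix constructor (tx, ty, tz, tux, tuy, tuz) and the visp.core import path are assumptions, and robot again stands for a concrete RobotWireFrameSimulator subclass instance.

import math
from visp.core import HomogeneousMatrix

def configure_simulation(robot):
    # Pose of the object in the fixed world frame (assumed tx, ty, tz, tux, tuy, tuz ctor)
    robot.set_fMo(HomogeneousMatrix(0.0, 0.0, 0.0, 0.0, 0.0, 0.0))

    # Desired pose of the robot's camera relative to the object
    robot.setDesiredCameraPosition(HomogeneousMatrix(0.0, 0.0, 0.5, 0.0, 0.0, 0.0))

    # Point of view of the external camera, relative to the world reference frame
    robot.setExternalCameraPosition(HomogeneousMatrix(0.0, 0.0, 2.0, 0.0, 0.0, 0.0))

    # Timing and velocity limits used by the simulation thread
    robot.setSamplingTime(0.040)                    # seconds
    robot.setConstantSamplingTimeMode(True)
    robot.setMaxTranslationVelocity(0.3)            # m/s
    robot.setMaxRotationVelocity(math.radians(30))  # rad/s

    # Current pose of the object in the robot's camera frame
    return robot.get_cMo()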