Automated Grasping: PincherX 100 4-DOF Robot Arm Grasps a Purple Pen

Programmed a Trossen PincherX 100 4-DOF robot arm in Python to grasp a purple pen.

Overview

I used an Intel RealSense D435i camera to locate a purple Northwestern pen and a Trossen PincherX 100 robot arm to grasp it. My approach first converts the RealSense RGB image to the HSV color space and thresholds it on the pen's range of purple hues, producing a binary mask in which the detected pen pixels are white and all other pixels are black.
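A minimal sketch of this thresholding step is shown below; the HSV bounds and the cleanup kernel size are assumptions that would need to be tuned against the live camera stream.

```python
import cv2
import numpy as np

# Hypothetical HSV bounds for the purple pen; in practice these are tuned
# interactively against the RealSense color stream.
LOWER_PURPLE = np.array([110, 60, 60])
UPPER_PURPLE = np.array([150, 255, 255])

def threshold_purple(color_image: np.ndarray) -> np.ndarray:
    """Return a binary mask: purple pixels are white (255), everything else is black."""
    hsv = cv2.cvtColor(color_image, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_PURPLE, UPPER_PURPLE)
    # Morphological opening removes small speckles so the pen is the dominant blob.
    kernel = np.ones((5, 5), np.uint8)
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
```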

Using OpenCV's contour detection, I computed the pixel coordinates of the pen's centroid, which locates the pen relative to the camera in the image plane. By aligning this pixel with the RealSense depth image, I recovered the pen's 3D position in the camera frame.
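The sketch below shows one way to implement this step with OpenCV and pyrealsense2; the helper names are illustrative, and it assumes the depth frame has already been aligned to the color frame (e.g. with rs.align).

```python
import cv2
import numpy as np
import pyrealsense2 as rs

def pen_centroid(mask: np.ndarray):
    """Return the (u, v) pixel centroid of the largest contour in the mask, or None."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])

def pixel_to_camera_frame(intrinsics: rs.intrinsics, pixel, depth_m: float):
    """Deproject a pixel and its depth (in meters) into a 3D point in the camera frame."""
    return rs.rs2_deproject_pixel_to_point(
        intrinsics, [float(pixel[0]), float(pixel[1])], depth_m
    )
```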

I then transformed the pen's position from the camera frame to the robot frame, which allowed the arm to move to the pen and grasp it securely.
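A sketch of this last step is below. The camera-to-robot transform shown is a placeholder (in practice it comes from measuring or calibrating the camera's pose relative to the arm's base), and the arm and gripper calls assume the Interbotix Python API (interbotix_xs_modules), whose exact module paths vary between releases.

```python
import numpy as np
from interbotix_xs_modules.arm import InterbotixManipulatorXS  # import path may differ by version

# Placeholder homogeneous transform from the camera frame to the robot base frame.
T_ROBOT_CAMERA = np.array([
    [ 0.0,  0.0, 1.0, 0.15],
    [-1.0,  0.0, 0.0, 0.00],
    [ 0.0, -1.0, 0.0, 0.10],
    [ 0.0,  0.0, 0.0, 1.00],
])

def camera_to_robot(point_camera) -> np.ndarray:
    """Transform a 3D point from the camera frame into the robot base frame."""
    p = np.array([point_camera[0], point_camera[1], point_camera[2], 1.0])
    return (T_ROBOT_CAMERA @ p)[:3]

def grasp_at(point_robot) -> None:
    """Move the PincherX 100 end effector to the pen and close the gripper."""
    bot = InterbotixManipulatorXS("px100", "arm", "gripper")
    x, y, z = point_robot
    bot.gripper.open()
    bot.arm.set_ee_pose_components(x=x, y=y, z=z)
    bot.gripper.close()
    bot.arm.go_to_home_pose()  # lift the pen away from the table
```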

Centroid and thresholding of the purple pen
