ECE capstone design project: autonomous gripper system

Posted on April 03, 2017


Another example of a capstone design project: a robotic system that automatically moves items around warehouses and shipping centres.

  • Video transcript:

    Can you tell us the name of your project and its scope?

    Okay, the name of the project is the Autonomous Gripper System. The scope is for it to autonomously pick up objects and then place them wherever you want without any human interaction. So, completely by itself.

    Warehouse companies such as Amazon and McMaster-Carr cannot autonomously pick up the objects in their warehouses and place them wherever they want. So, this makes things more efficient and faster, and it decreases the chance of an object being mishandled, because the robots handle it completely.

    So my partner here, Sam, mostly designed the claw. We had to cut the pieces out of sheet metal, and we 3D printed the gears as well. Then we had to connect all the wires and fasten everything together.

    Tell me a little bit more about the technical aspects.

    Yeah, sure. So, detection is very accurate; it can detect every object. In terms of positioning over the object, it's quite capable of that.

    One restriction is that if you move the objects while the program is running, it is unable to cope with a dynamically changing scene. So, it assumes that the boxes are not going to move once it starts scanning the workspace.

    There are two systems working together in this project. The first is the gripping mechanism which, as Ghani told you, we built from scratch. That is a mechanism built of stepper motors and six force sensors. On top of that we have an Arduino to control the unit and a driver for each of the motors. [A rough firmware sketch follows the transcript.]

    The other system working alongside that one is the imaging system. So, my partner Aaron developed imaging software using a Microsoft Kinect to build a 3D point cloud of the workspace.

    So, how it works is you build a point cloud of the space. You find the largest plane, which is assumed to be the floor, so you can remove that from your data, and what you're left with is small clusters of points. Then you run those through an algorithm we've got that works out the orientation of the box. That gives you the position and orientation of the box, so the robot then positions itself over the box for the hand to grab. [A sketch of this pipeline in code follows the transcript.]

    Can you tell me what kind of problems or issues you guys had to overcome when you were going through your design process?

    Sure, we had lots of problems. The first problem we had, from the hardware perspective, was that we fried two of our Arduino boards. We did that because we had incorrectly assumed one of the pins on the drivers was an input when it was actually an output.

    We learned a lot. We had to go through each driver's schematic step by step and understand the current flow and how it could possibly have happened, because we had made a wrong assumption there. That was our first mistake. There were a few other errors that came just from manufacturing problems. For example, there's a little too much play, or tolerance, in the axles, which can sometimes lead to poor performance by the claw.

    So, those are the main hardware problems. Another big problem we had was that last night, after we moved it, this giant robot behind me didn't work at all. It took a large group of people a long time to figure out which of the hundreds of wires on here was just barely disconnected. So, that was another big problem that we faced.
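
The gripper firmware itself isn't shown in the video, but as a rough idea of how an Arduino can drive a stepper motor through a step/dir driver and use a force sensor to decide when the claw has gripped, here is a minimal sketch. The pin numbers, threshold and timing are placeholder assumptions, not the team's actual values.

    // Minimal gripper-firmware sketch (assumed, not the team's code): close the
    // claw by pulsing a step/dir stepper driver until a force sensor reports
    // enough pressure. Pins and threshold are placeholders.
    const int STEP_PIN  = 3;          // pulse (STEP) input on the stepper driver
    const int DIR_PIN   = 4;          // direction (DIR) input on the stepper driver
    const int FORCE_PIN = A0;         // one of the six force sensors, read as analog
    const int GRIP_THRESHOLD = 600;   // ADC reading treated as "object gripped"

    void setup() {
      pinMode(STEP_PIN, OUTPUT);
      pinMode(DIR_PIN, OUTPUT);
      digitalWrite(DIR_PIN, HIGH);    // set the direction that closes the claw
    }

    void loop() {
      // Keep stepping until the force sensor says we are holding the object.
      if (analogRead(FORCE_PIN) < GRIP_THRESHOLD) {
        digitalWrite(STEP_PIN, HIGH);
        delayMicroseconds(500);       // pulse width also sets the closing speed
        digitalWrite(STEP_PIN, LOW);
        delayMicroseconds(500);
      }
      // Above the threshold the loop stops pulsing and the driver holds position.
    }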
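
The point-cloud pipeline is only described in words, so here is a sketch of the same idea written against the Point Cloud Library (PCL) rather than the team's own software: RANSAC finds the largest plane (assumed to be the floor), the floor points are removed, the remaining points are clustered, and the principal axes of each cluster give a position and orientation for the robot to centre itself over. The function name findBoxes and all thresholds and cluster sizes are placeholders.

    // Sketch of the described pipeline using PCL; see the assumptions above.
    #include <iostream>
    #include <vector>
    #include <pcl/point_types.h>
    #include <pcl/point_cloud.h>
    #include <pcl/ModelCoefficients.h>
    #include <pcl/segmentation/sac_segmentation.h>
    #include <pcl/segmentation/extract_clusters.h>
    #include <pcl/filters/extract_indices.h>
    #include <pcl/search/kdtree.h>
    #include <pcl/common/pca.h>

    void findBoxes(const pcl::PointCloud<pcl::PointXYZ>::Ptr& cloud)
    {
      // 1. Find the largest plane with RANSAC and assume it is the floor.
      pcl::ModelCoefficients::Ptr coeffs(new pcl::ModelCoefficients);
      pcl::PointIndices::Ptr floor_inliers(new pcl::PointIndices);
      pcl::SACSegmentation<pcl::PointXYZ> seg;
      seg.setModelType(pcl::SACMODEL_PLANE);
      seg.setMethodType(pcl::SAC_RANSAC);
      seg.setDistanceThreshold(0.01);          // 1 cm plane tolerance (placeholder)
      seg.setInputCloud(cloud);
      seg.segment(*floor_inliers, *coeffs);

      // 2. Remove the floor points from the data.
      pcl::PointCloud<pcl::PointXYZ>::Ptr objects(new pcl::PointCloud<pcl::PointXYZ>);
      pcl::ExtractIndices<pcl::PointXYZ> extract;
      extract.setInputCloud(cloud);
      extract.setIndices(floor_inliers);
      extract.setNegative(true);               // keep everything that is not floor
      extract.filter(*objects);

      // 3. What is left are small clusters of points, roughly one per box.
      pcl::search::KdTree<pcl::PointXYZ>::Ptr tree(new pcl::search::KdTree<pcl::PointXYZ>);
      tree->setInputCloud(objects);
      std::vector<pcl::PointIndices> clusters;
      pcl::EuclideanClusterExtraction<pcl::PointXYZ> ec;
      ec.setClusterTolerance(0.02);            // 2 cm between neighbouring points
      ec.setMinClusterSize(100);               // ignore tiny noise clusters
      ec.setSearchMethod(tree);
      ec.setInputCloud(objects);
      ec.extract(clusters);

      // 4. Estimate each box's position and orientation from its cluster.
      for (const auto& c : clusters) {
        pcl::PointIndices::Ptr idx(new pcl::PointIndices(c));
        pcl::PCA<pcl::PointXYZ> pca;
        pca.setInputCloud(objects);
        pca.setIndices(idx);
        Eigen::Vector3f position = pca.getMean().head<3>();   // where the box is
        Eigen::Matrix3f axes     = pca.getEigenVectors();     // how it is oriented
        // The robot would centre the hand over `position`, aligned to `axes`.
        std::cout << "box at " << position.transpose() << std::endl;
      }
    }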