UR5 Jenga-Playing Robot Arm

Team Name

Jovial McNulty

Timeline

Fall 2018 – Spring 2019

Students

  • Joe Cloud
  • Gabriel Comer
  • Carlos Crane
  • Sammy Hamwi
  • Maxwell Sanders

Abstract

The objective of this project is to implement a Jenga-playing robot system that faces a human opponent. Our solution uses 3D computer vision to build scans of the tower’s state and a vacuum gripper to grip blocks. The human opponent and the robot take turns pulling blocks and placing them at the top of the stack. The goal is not necessarily to defeat the human opponent, but to build a system capable of making multiple moves before the tower collapses.

Background

This project serves as an interactive demonstration of computer science concepts with the goal of inspiring the next generation of scientists & engineers. This product is designed to engage K-12 students at showcases and university outreach events. Students are expected to engage with the product by taking turns with the robot in a fun, friendly game of Jenga.

This product is intended to be used by the university, the College of Engineering, or the Computer Science & Engineering Department as a recruiting tool for future engineering students. Making the product available commercially to other universities and companies is possible; however, because this is an end-to-end system designed to demonstrate what is possible as a STEM student at UT Arlington, it is intended to be administered on behalf of the university. The end user of this system is a member of the general public, and the manufacturer states that Jenga is appropriate for players ages six and up.

Project Requirements

  • The UR5 Jenga-Playing Robot will have: a robotic arm for movement and 3D scanning (a Universal Robots UR5 robotic arm), a camera for computer vision, a vacuum gripper connected to an air compressor for suction, and a solenoid acting as an on/off switch for the vacuum gripper.
  • The user interface of the UR5 Jenga-Playing Robot Arm will be simple and intuitive. It shall consist of a button that signals when the player’s move is complete and display elements that provide feedback to the user. This interface will control the flow of the game and determine when the UR5 Jenga-Playing Robot Arm can take its turn.
  • Part of the output interface will be an LED strip that indicates whether the robot has completed its turn. The LED strip will flash green while the user is taking their turn, red while the UR5 Jenga-Playing Robot Arm is taking its turn, and white while the robot is scanning the tower.
  • The gripper will need to overcome the levering force a block exerts on the tower as it is removed, and it shall not apply unnecessary force to the tower.
  • The UR5 Jenga-Playing Robot shall pull blocks from the tower and place them on top using the vacuum gripper without intervention by the user. Once the user has signaled the end of their turn with the button, the user will not be required to touch the tower or interact with the display in any way until their next turn.
  • The UR5 Jenga-Playing Robot’s software will be delivered via download, and the software packages it depends on will be installed through a package manager. The robot will be programmed in Python using the OpenCV and ROS packages. OpenCV will handle all computer vision tasks, 2D and 3D, that the robot needs to play Jenga. ROS will handle all communication with the UR5 robot arm, acting as middleware that controls the robot’s components.
  • In order to ensure that the UR5 Jenga-Playing Robot can play games for long enough to stay interesting for users, the UR5 Jenga-Playing Robot shall be able to play against itself for 4-6 turns. Since user skill levels may vary greatly, the performance of the UR5 Jenga-Playing Robot playing turns against itself should provide more consistent and reliable feedback regarding its block-pulling abilities.
  • The UR5 Jenga-Playing Robot shall take no longer than 60 seconds for each of its turns. This ensures that users will be able to maintain interest during the down-time between their own turns. It also ensures that any line to play against the UR5 Jenga-Playing Robot in action at a demonstration will move at a steady pace.
  • The system will use computer vision, provided by the open-source OpenCV package, to play Jenga. OpenCV will be installed through a package manager. Because the software is also largely dependent on the ROS ecosystem, software installation and configuration shall be consistent with ROS conventions. A rough sketch of the vision step appears below this list.
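As a rough illustration of the vision requirement above, the sketch below uses OpenCV in Python to pick out candidate block faces from a single 2D image of one side of the tower. The file name, area threshold, and aspect-ratio filter are illustrative assumptions; the actual system builds 3D scans of the tower from multiple camera poses on the UR5.

```python
import cv2

def find_block_faces(image_path):
    """Return bounding boxes of candidate Jenga block faces in an image."""
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(cv2.GaussianBlur(gray, (5, 5), 0), 50, 150)
    # [-2] selects the contour list in both OpenCV 3.x and 4.x.
    contours = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                cv2.CHAIN_APPROX_SIMPLE)[-2]
    faces = []
    for c in contours:
        x, y, w, h = cv2.boundingRect(c)
        # Keep rectangles whose proportions roughly match a block's long
        # face (about 5:1); the thresholds here are illustrative only.
        if h > 0 and 3.0 < w / float(h) < 6.0 and w * h > 500:
            faces.append((x, y, w, h))
    return faces

if __name__ == "__main__":
    for face in find_block_faces("tower.png"):
        print("candidate block face:", face)
```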

System Overview

The UR5 Jenga-playing robot’s system is split into two distinct layers: the hardware layer and the software layer. The hardware layer covers everything in the physical world, such as the sensors, the camera, the gripper, and the I/O. The software layer covers what we call the game engine, which acts as the brain of the operation. All interactions between these two layers are handled with ROS.

The hardware layer contains the sensors, vision, I/O, and gripper. It sends information from the UR5 and the vision system to the game engine so that the game engine can determine which move to make. After the game engine decides the best course of action, it sends directions that navigate the robot to the correct block and carry out the move. Once a move is completed, the physical game interface is updated so the players know what is happening.
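The sketch below shows how the game-engine layer could sit on top of ROS in Python. The topic names (/jenga/turn_done for the player’s button, /jenga/led_state for the LED strip) and the placeholder methods are assumptions for illustration; the actual interfaces are defined in the design specifications.

```python
#!/usr/bin/env python
import rospy
from std_msgs.msg import Empty, String

class GameEngine(object):
    """Skeleton of the software-layer 'brain': waits for the player's
    button press, scans the tower, then commands a pull-and-place."""

    def __init__(self):
        self.led_pub = rospy.Publisher("/jenga/led_state", String, queue_size=1)
        rospy.Subscriber("/jenga/turn_done", Empty, self.on_player_done)

    def on_player_done(self, _msg):
        self.led_pub.publish(String(data="white"))   # scanning the tower
        target = self.scan_and_choose_block()
        self.led_pub.publish(String(data="red"))     # robot's turn
        self.pull_and_place(target)
        self.led_pub.publish(String(data="green"))   # back to the player

    def scan_and_choose_block(self):
        # Placeholder: move the UR5 through scan poses, run the vision
        # pipeline, and pick the safest block to pull.
        return None

    def pull_and_place(self, target):
        # Placeholder: plan the approach, energize the vacuum gripper,
        # extract the block, and place it on top of the tower.
        pass

if __name__ == "__main__":
    rospy.init_node("jenga_game_engine")
    GameEngine()
    rospy.spin()
```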

Results

This video demonstrates a block pull and place on the UR5 Jenga-Playing Robot Arm. We were able to successfully pull a block and place it at the top of the tower using the vacuum gripper. The vacuum gripper can grip blocks on command because a solenoid controls its suction as an on/off switch.
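A minimal sketch of how the solenoid could be switched from software, assuming it is wired to digital output 0 of the UR5 controller and driven through the set_io service exposed by the ur_modern_driver; the pin number and service name are assumptions about our particular wiring and driver setup.

```python
#!/usr/bin/env python
import rospy
from ur_msgs.srv import SetIO

FUN_SET_DIGITAL_OUT = 1   # "set digital output" function code from ur_msgs/SetIO
SOLENOID_PIN = 0          # assumed wiring: solenoid on digital output 0

def set_solenoid(on):
    """Open or close the solenoid valve that feeds the vacuum gripper."""
    rospy.wait_for_service("/ur_driver/set_io")
    set_io = rospy.ServiceProxy("/ur_driver/set_io", SetIO)
    set_io(fun=FUN_SET_DIGITAL_OUT, pin=SOLENOID_PIN, state=1.0 if on else 0.0)

if __name__ == "__main__":
    rospy.init_node("vacuum_gripper_demo")
    set_solenoid(True)    # grip the block
    rospy.sleep(2.0)
    set_solenoid(False)   # release it
```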

Future Work

After the completion of this project, our team will maintain communication with customers who choose to keep supporting the product. If any problems arise with the product after full delivery, we will assist in patching or resolving the issue, depending on our availability.

Project Files

Project Charter

System Requirement Specifications

Architectural Design Specification

Detailed Design Specification

Poster

