Abstract
Benchmarking is a common practice for quantifying the performance of different approaches to the same task. By maintaining a consistent test environment, the inherent behaviour of each method can be distinguished, which is key to progressing a research field. Robotic manipulation research currently lacks such standardisation, making it challenging to fairly assess and compare approaches across the literature. This paper proposes new evaluation criteria in conjunction with a benchmarking platform for measuring the effectiveness of a grasping pipeline. The proposed benchmarking template offers a testing platform for two-fingered, vision-based grasp synthesis methodologies. A prototype system was constructed and shown to serve as a suitable benchmarking platform for deploying various grasp synthesis methodologies, with 4000 trials conducted to evaluate the differing approaches. Results showed that the proposed metrics provide useful insight into the quality of the grasp poses produced by a grasp synthesis methodology. Moreover, these metrics offer more comprehensive insight into grasp outcomes than the measures traditionally used to quantify performance, and they present a fair baseline for comparison between different approaches.
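For context, a traditional measure commonly used in the grasp-detection literature is the rectangle metric, under which a predicted grasp counts as correct if its rectangle overlaps a ground-truth rectangle by more than 25% (Jaccard index / intersection-over-union) and its orientation differs by less than 30 degrees. The sketch below illustrates that criterion only, not the metrics proposed in this paper; the `Grasp` type is hypothetical, and the rectangles are treated as axis-aligned for brevity (the full metric uses rotated rectangles).

```python
from dataclasses import dataclass


@dataclass
class Grasp:
    """Hypothetical grasp rectangle: axis-aligned box plus gripper angle."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float
    angle_deg: float  # gripper orientation in degrees


def iou(a: Grasp, b: Grasp) -> float:
    """Intersection-over-union (Jaccard index) of two axis-aligned boxes."""
    ix = max(0.0, min(a.x_max, b.x_max) - max(a.x_min, b.x_min))
    iy = max(0.0, min(a.y_max, b.y_max) - max(a.y_min, b.y_min))
    inter = ix * iy
    area_a = (a.x_max - a.x_min) * (a.y_max - a.y_min)
    area_b = (b.x_max - b.x_min) * (b.y_max - b.y_min)
    union = area_a + area_b - inter
    return inter / union if union > 0.0 else 0.0


def rectangle_metric(pred: Grasp, truth: Grasp,
                     iou_thresh: float = 0.25,
                     angle_thresh_deg: float = 30.0) -> bool:
    """Grasp counts as correct if IoU exceeds 0.25 and the orientation
    difference (modulo 180 deg, since a gripper is symmetric) is under
    30 degrees -- the thresholds conventional in the literature."""
    diff = abs(pred.angle_deg - truth.angle_deg) % 180.0
    diff = min(diff, 180.0 - diff)
    return iou(pred, truth) > iou_thresh and diff < angle_thresh_deg
```

Note that this image-space criterion only scores predicted rectangles against annotations; it says nothing about the physical outcome of executing the grasp, which is precisely the gap the physical benchmarking platform proposed here is intended to address.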
Acknowledgements
This research was funded and supported by the Richard and Mary Earle Technology Trust, the Ken and Elizabeth Powell Bursary, and the Massey University Foundation.
Data Availability / Code Availability
Related resources such as source code, datasets, networks, CAD drawings, designs, and microcontroller code are available at: https://drive.google.com/open?id=1VsEjCl6hrX3FeL9VRF-J9CzVL7JHXO15.
Contributions
Jacques Janse van Vuuren (40%). Liqiong Tang (30%). Ibrahim Al-Bahadly (15%). Khalid Mahmood Arif (15%). Please note that all authors contributed equally to the research work to which this paper relates; however, not all authors contributed equally to the production of this paper.
Ethics declarations
Conflicts of Interest/Competing Interests
This research is supported by Massey University, New Zealand. The authors declare that they have no known conflicts of interest or competing interests.
Additional information
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Cite this article
van Vuuren, J.J., Tang, L., Al-Bahadly, I. et al. A Benchmarking Platform for Learning-Based Grasp Synthesis Methodologies. J Intell Robot Syst 102, 56 (2021). https://doi.org/10.1007/s10846-021-01410-5