
Research Overview
Robot soccer is a highly appealing and accessible way to engage a broad audience in the complex technical challenges of autonomous robotics. For students it provides a competitive, challenging, and highly motivating environment for developing research and development skills in the field. Importantly, unlike many research contexts, it requires a strong blend of applied and theoretical skills to create an integrated system that works well in a real environment rather than subsystems that only work well in isolation, in simulation, on standard data sets, or in the research laboratory.
Research innovations and advanced education in robot soccer will also contribute to autonomous robotics in other domains. We expect that autonomous robots will be used to support smart and healthy ageing, to aid rehabilitation (robots are good with repetitive behaviour and smart robots can improve compliance and outcomes), to assist people in physical work, and to perform emergency/disaster area operations (e.g. investigating the status of the Fukushima nuclear plant in Japan) among many other applications.
Challenges
The research challenges of robot soccer include real-time control, decision making in dynamic environments using incomplete information, and interpreting noisy non-symbolic sensor readings. Additional challenges include development of robust intelligent behaviour, distributed control among multiple robot agents, and enabling the robots to collaborate and interact as needed.
We particularly specialize in the following areas:
- Efficient embedded and edge AI
- Individual and multi-agent behaviour
- Computer vision
- Mapping and localisation
- Software for autonomous systems
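To give a flavour of the estimation problems that arise in mapping and localisation, the following is a minimal illustrative sketch of Monte Carlo (particle filter) self-localisation in one dimension: a robot estimates its position along a line from noisy range readings to a single known landmark. This is a teaching sketch, not code from our systems; all names and parameter values (noise levels, landmark position) are hypothetical.

```python
import math
import random

def particle_filter_step(particles, control, measurement, landmark_x,
                         motion_noise=0.05, sensor_noise=0.2):
    """One predict/update/resample cycle of a 1D particle filter.

    particles  : list of candidate x positions (metres)
    control    : commanded displacement since the last step (metres)
    measurement: noisy measured range to the landmark (metres)
    """
    # Predict: move every particle by the control input, plus motion noise.
    moved = [p + control + random.gauss(0, motion_noise) for p in particles]

    # Update: weight each particle by how well it explains the measurement
    # (unnormalised Gaussian likelihood of the range error).
    weights = []
    for p in moved:
        err = measurement - abs(landmark_x - p)
        weights.append(math.exp(-(err * err) / (2 * sensor_noise ** 2)))

    # Resample: draw a new particle set in proportion to the weights.
    return random.choices(moved, weights=weights, k=len(moved))

# Usage: a robot with unknown start in [0, 5] m walks 0.5 m per step
# toward a landmark at x = 10 m (positions and noise values are made up).
random.seed(0)
particles = [random.uniform(0, 5) for _ in range(500)]
true_x = 0.0
for _ in range(5):
    true_x += 0.5
    z = abs(10.0 - true_x) + random.gauss(0, 0.2)  # simulated noisy range
    particles = particle_filter_step(particles, 0.5, z, landmark_x=10.0)

estimate = sum(particles) / len(particles)  # posterior mean position
```

In a real soccer setting the state is at least (x, y, heading), measurements come from vision (field lines, goals, the centre circle), and the filter must also cope with ambiguous, symmetric field features; the papers listed below address exactly these complications.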

Publications
ZhengBai Yao, Will Douglas, Simon O’Keeffe, and Rudi Villing. Faster YOLO-LITE: Faster Object Detection on Robot and Edge Devices. In RoboCup 2021: Robot World Cup XXIV, Lecture Notes in Artificial Intelligence. Springer (to appear).
Simon O’Keeffe and Rudi Villing. A lightweight region proposal network for task specific applications. In Proc. 30th Irish Signals and Systems Conference (ISSC), June 2019. Maynooth, Ireland.
Simon O’Keeffe and Rudi Villing. Evaluating pruned object detection networks for real-time robot vision. In Proc. IEEE International Conference on Autonomous Robot Systems and Competitions (ICARSC), April 2018. Torres Vedras, Portugal.
Simon O’Keeffe and Rudi Villing. Evaluating quantized convolutional neural networks for embedded systems. In Proc. Irish Machine Vision and Image Processing Conference, September 2017. Maynooth, Ireland.
Simon O’Keeffe and Rudi Villing. A benchmark data set and evaluation of deep learning architectures for ball detection in the RoboCup SPL. In RoboCup 2017: Robot World Cup XXI, Lecture Notes in Artificial Intelligence. 2017.
Simon O’Keeffe, Tomas E. Ward, and Rudi Villing. Improving task performance through high level shared control of multiple robots with a context aware human-robot interface. In Proc. IEEE International Conference on Autonomous Robot Systems and Competitions (ICARSC), May 2016. Braganca, Portugal.
Thomas Whelan, Sonja Stüdli, John McDonald, and Richard H. Middleton. Efficient localization for robot soccer using pattern matching. In Leveraging Applications of Formal Methods, Verification, and Validation, Communications in Computer and Information Science. 2012.
Thomas Whelan, Sonja Stüdli, John McDonald, and Richard Middleton. Line point registration: A technique for enhancing robot localization in a soccer environment. In RoboCup 2011: Robot Soccer World Cup XV, pages 258–269. 2012.
M. Quinlan and R. Middleton. Multiple model Kalman filters: A localization technique for RoboCup soccer. In Proc. RoboCup Symposium, July 2009. Graz, Austria.
W. Inam. Particle filter based self-localization using visual landmarks and image database. In Proc. IEEE International Symposium on Computational Intelligence in Robotics and Automation, December 2009. Daejeon, South Korea, pp. 246–251.
Public data sets
SPL Object Detection Dataset v2 (4 object classes, 4908 images, 14664 object instances)
Please cite: “ZhengBai Yao, Will Douglas, Simon O’Keeffe, and Rudi Villing. Faster YOLO-LITE: Faster Object Detection on Robot and Edge Devices. In RoboCup 2021: Robot World Cup XXIV, Lecture Notes in Artificial Intelligence. Springer (to appear).”
Ball classification data set (6564 images, 5209 ball patches, 10924 non-ball patches)
Please cite: “Simon O’Keeffe and Rudi Villing. A benchmark data set and evaluation of deep learning architectures for ball detection in the RoboCup SPL. In RoboCup 2017: Robot World Cup XXI, Lecture Notes in Artificial Intelligence. 2017.”