RoboCup HL-VS Workshop: Team Talk by NUbots
Table of Contents
- Introduction
- Team Members
- Acknowledgment of Country
- About NUbots
- Hardware Used
- Sensors Used
- Software Framework: NUClear
- Odometry and Localization
- Vision System
- Motion System
- NUsight: Visual Debugging Tool
- Vision Data Tool
- Walk Optimization Tool
- Experience with Webots
- FAQ
Introduction
The NUbots team from the University of Newcastle in Australia has been actively participating in RoboCup since 2002. In this article, we take a closer look at the team, their hardware and software systems, and the tools they have developed to improve their performance in the competition. From the NUClear software framework to their visual debugging tool NUsight, NUbots have plenty to offer in terms of cutting-edge robotics. So let's dive in and explore the world of NUbots!
Team Members
The NUbots team consists of dedicated and talented individuals who have contributed significantly to its success over the years. Here are the team members and mentors mentioned in the presentation:
- Isabelle: Speaker for the team presentation
- Kipp: Former team member
- Uzziah: Former team member
- Aaron: Team mentor
- Josephus: Team mentor
- Trent: Team mentor
- Alex: Team mentor
Acknowledgment of Country
Before delving into the technical details, the NUbots team takes a moment to acknowledge and pay respect to the First Nations peoples of Australia. They recognize the Awabakal people and their continuing connection to the land, skies, and waterways. The team also acknowledges the Darkinjung people, traditional custodians of the land on which the university's Central Coast campus is located. They also extend their respect to the neighboring nations surrounding the Awabakal nation.
About NUbots
The NUbots team hails from the University of Newcastle in Australia and has been an active participant in RoboCup since 2002. Their team handbook, NUbook, provides comprehensive information about the team, including documentation of their systems and guides. All of their code is open source on GitHub under the "NUbots" organization.
Hardware Used
The NUbots team uses the igus Humanoid Open Platform as their hardware base. They have made some modifications to improve its usability, including changes to materials and sizing. The robots are equipped with stereo vision and fisheye lenses for a wide field of view. They also use Dynamixel motors in the legs, arms, and neck, along with the sub-controller from the Robotis Darwin-OP.
Pros:
- The igus Humanoid Open Platform allows for easy modification and customization.
- The use of stereo vision and fisheye lenses enhances perception capabilities.
Cons:
- The modifications made to the hardware may introduce compatibility issues.
Sensors Used
To gather precise data, the NUbots team employs various sensors in their robots. They use stereo vision for perception, although it is not yet fully integrated into their software. Fisheye lenses give the cameras a 180-degree field of view. The robots are also equipped with accelerometers, gyroscopes, and joint encoders, which are crucial for odometry and motion control.
Software Framework: NUClear
The NUbots team uses the NUClear software framework to power their robots. Written in C++, NUClear is designed to be fast, modular, and easy to use. It is highly portable and provides efficient message passing between modules. The framework was built with the philosophy of being easy to use correctly and hard to use incorrectly. The team leverages NUClear's multi-threaded scheduling and powerful messaging system to handle concurrent tasks and the communication between modules.
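The basic pattern is a set of modules (reactors) that react to messages and emit new ones. The sketch below loosely follows NUClear's public examples to illustrate this; the Ball and WalkCommand message types and the ChaseBall module are hypothetical, not NUbots' actual code.

```cpp
#include <memory>
#include <nuclear>

// Hypothetical message types; in the real codebase these are Protobuf messages.
struct Ball {
    float distance;  // estimated distance to the ball in metres
};
struct WalkCommand {
    float forward_speed;  // requested forward velocity in m/s
};

// A NUClear module is a Reactor: it registers reactions that the runtime
// schedules on its thread pool whenever a triggering message is emitted.
class ChaseBall : public NUClear::Reactor {
public:
    explicit ChaseBall(std::unique_ptr<NUClear::Environment> environment)
        : Reactor(std::move(environment)) {

        // React every time a Ball message is emitted anywhere in the system,
        // and emit a command for whichever module handles walking.
        on<Trigger<Ball>>().then([this](const Ball& ball) {
            auto cmd           = std::make_unique<WalkCommand>();
            cmd->forward_speed = ball.distance > 0.3f ? 0.2f : 0.0f;
            emit(std::move(cmd));
        });
    }
};
```

Because reactions only declare what data they need, the framework can run them concurrently, and modules stay decoupled: swapping out the walk engine does not require touching the vision code.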
Odometry and Localization
To accurately determine the robot's position and orientation, the NUbots team uses a combination of odometry and localization. Odometry follows a dead-reckoning approach, estimating the robot's motion from accelerometer, gyroscope, and kinematic data, which are fused with an unscented Kalman filter. Localization uses a particle filter fed with goal detections from vision processing. The odometry estimate is also passed to the Visual Mesh, which detects objects and enables robust localization.
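As an illustration of the dead-reckoning step only (not NUbots' actual filter, which fuses these inputs with an unscented Kalman filter), a planar pose can be propagated from a gyroscope yaw rate and a body-frame velocity estimate like this:

```cpp
#include <cmath>

// Minimal planar dead reckoning: integrate the yaw rate from the gyroscope
// and a body-frame velocity estimate (e.g. from leg kinematics) over one step.
struct Pose2D {
    double x     = 0.0;  // metres, world frame
    double y     = 0.0;  // metres, world frame
    double theta = 0.0;  // radians, heading
};

Pose2D propagate(Pose2D pose, double vx, double vy, double yaw_rate, double dt) {
    // Rotate the body-frame velocity into the world frame, then integrate.
    pose.x += (vx * std::cos(pose.theta) - vy * std::sin(pose.theta)) * dt;
    pose.y += (vx * std::sin(pose.theta) + vy * std::cos(pose.theta)) * dt;
    pose.theta += yaw_rate * dt;
    return pose;
}
```

Errors in this estimate accumulate over time, which is exactly why the particle-filter localization and vision-based corrections described above are needed.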
Vision System
The NUbots team relies on a sophisticated vision system to perceive the environment and detect objects. They use the Visual Mesh, an algorithm that analyzes the camera images to identify objects such as goals and balls. The vision system takes the robot's odometry into account and uses heuristics to judge how likely detections are to be real objects on the field. Techniques such as clustering field points to detect the green horizon reduce the computational load. The Visual Mesh, combined with deep learning, enables accurate object detection and segmentation.
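The green-horizon idea can be sketched independently of the Visual Mesh itself: for every image column, keep the topmost pixel classified as field, and skip everything above that curve in later processing. The data structures below are assumptions for illustration only.

```cpp
#include <algorithm>
#include <limits>
#include <vector>

// A point classified as "field" (green) by the detector, in image coordinates.
struct FieldPoint {
    int x;  // column
    int y;  // row, with 0 at the top of the image
};

// For each image column, find the topmost field point. The resulting curve is
// a simple green horizon: pixels above it are sky, crowd, or background and
// can be ignored, cutting the work done by the ball and goal detectors.
std::vector<int> greenHorizon(const std::vector<FieldPoint>& points, int image_width) {
    std::vector<int> horizon(image_width, std::numeric_limits<int>::max());
    for (const auto& p : points) {
        if (p.x >= 0 && p.x < image_width) {
            horizon[p.x] = std::min(horizon[p.x], p.y);
        }
    }
    return horizon;  // horizon[x] == INT_MAX means no field was seen in that column
}
```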
Motion System
The NUbots team has developed a motion system that allows their robots to perform actions such as walking, kicking, and getting up. Most motions, including getting up and kicking, use keyframe animations. The walk engine, based on the Quintic Walk, provides stable and efficient locomotion. The team has also worked on walk optimization using a multi-objective genetic algorithm, which has been instrumental in fine-tuning the walk engine for different motions and improving stability and performance.
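The keyframe approach can be pictured as a list of timed joint targets that a motion player interpolates between. The structure below is purely illustrative and is not NUbots' script format.

```cpp
#include <algorithm>
#include <cstddef>
#include <map>
#include <string>
#include <vector>

// One keyframe: reach these joint targets by the given time.
struct Keyframe {
    double time;                                  // seconds from the start of the script
    std::map<std::string, double> joint_targets;  // joint name -> angle in radians
};

// Linearly interpolate a single joint between the two keyframes bracketing t.
double jointAt(const std::vector<Keyframe>& script, const std::string& joint, double t) {
    for (std::size_t i = 1; i < script.size(); ++i) {
        if (t <= script[i].time) {
            const Keyframe& a = script[i - 1];
            const Keyframe& b = script[i];
            double alpha      = std::clamp((t - a.time) / (b.time - a.time), 0.0, 1.0);
            return a.joint_targets.at(joint) * (1.0 - alpha) + b.joint_targets.at(joint) * alpha;
        }
    }
    return script.back().joint_targets.at(joint);  // hold the final pose
}
```

In practice a get-up or kick script is such a sequence of hand-tuned poses, while the walk trajectory is generated procedurally by the Quintic Walk engine rather than from keyframes.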
NUsight: Visual Debugging Tool
The NUbots team has developed a web-based visual debugging tool called NUsight. It provides intuitive and informative visualizations of the robot's state and performance, with various views, including charts, for analyzing and debugging different aspects of the robot's behavior. NUsight follows the principles of simplicity, UDP messaging, declarative data-driven rendering, resemblance to the real world, and user control over the visualization. The tool has proven invaluable for understanding the robots' behavior and identifying areas for improvement.
Vision Data Tool
The NUbots team has created a vision data tool for collecting and analyzing vision data to train and test their algorithms. The tool randomizes the background and ball textures in each image, making the dataset more diverse and robust. It also records metadata such as lens parameters and odometry, which is needed for accurate vision processing. The tool has been instrumental in training the Visual Mesh and improving object detection and localization.
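Each synthetic image therefore needs enough metadata to relate the ground truth to the camera. A record along the following lines could accompany each sample; the field names here are assumptions, not the tool's actual schema.

```cpp
#include <array>
#include <string>

// Hypothetical per-image metadata for a synthetic training sample: the lens
// model and the camera pose are what is needed to relate pixels to positions
// on the field.
struct ImageMetadata {
    std::string image_file;                    // path to the rendered image
    std::string mask_file;                     // path to the ground-truth segmentation
    std::string lens_projection;               // e.g. a fisheye (equisolid) projection
    double focal_length;                       // normalized focal length
    double field_of_view;                      // radians
    std::array<double, 3> camera_position;     // camera position in the world frame (m)
    std::array<double, 4> camera_orientation;  // camera orientation as a quaternion
};
```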
Walk Optimization Tool
The NUbots team has developed a walk optimization tool that helps fine-tune the walk engine for their robots. It uses a multi-objective genetic algorithm to search for good walking parameters. By running many generations and individuals, the tool improves stability and performance across different walking scenarios. It has already been used to optimize the backward walking motion and holds promise for other locomotion patterns.
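To illustrate the multi-objective idea (a heavily simplified sketch, not the tool's actual optimizer), each candidate parameter set is scored on several objectives, the non-dominated candidates survive, and the next generation is produced by mutating the survivors:

```cpp
#include <algorithm>
#include <cstddef>
#include <random>
#include <vector>

// A candidate walk: a parameter vector plus its objective scores
// (e.g. {distance walked, stability margin}), both to be maximized.
struct Candidate {
    std::vector<double> params;
    std::vector<double> objectives;
};

// a dominates b if it is at least as good on every objective and strictly
// better on at least one.
bool dominates(const Candidate& a, const Candidate& b) {
    bool strictly_better = false;
    for (std::size_t i = 0; i < a.objectives.size(); ++i) {
        if (a.objectives[i] < b.objectives[i]) return false;
        if (a.objectives[i] > b.objectives[i]) strictly_better = true;
    }
    return strictly_better;
}

// One generation: keep the non-dominated front, then refill the population by
// mutating survivors. Evaluating a child (running its walk in simulation and
// measuring the objectives) is omitted here.
std::vector<Candidate> nextGeneration(const std::vector<Candidate>& pop, std::mt19937& rng) {
    std::vector<Candidate> front;
    for (const auto& c : pop) {
        bool dominated = std::any_of(pop.begin(), pop.end(),
                                     [&](const Candidate& o) { return dominates(o, c); });
        if (!dominated) front.push_back(c);
    }

    std::normal_distribution<double> noise(0.0, 0.05);
    std::uniform_int_distribution<std::size_t> pick(0, front.size() - 1);
    std::vector<Candidate> next = front;
    while (next.size() < pop.size()) {
        Candidate child = front[pick(rng)];
        for (double& p : child.params) p += noise(rng);
        child.objectives.clear();  // must be re-evaluated in simulation
        next.push_back(child);
    }
    return next;
}
```

Real multi-objective optimizers such as NSGA-II add crowding-distance selection and crossover, but the dominance test above is the core of how conflicting goals like speed and stability are traded off without collapsing them into a single score.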
Experience with Webots
The NUbots team recently transitioned from Gazebo to Webots for their simulation needs. While the initial integration with Webots presented some challenges, the team found it to be a more user-friendly and efficient platform for testing their systems, and the availability of the official RoboCup world in Webots made the migration easier. In the team's experience, the plugin and connection setup in Webots is easier to work with than Gazebo's.
FAQ
Q: Are NUbots' code and tools open source?
A: Yes, the NUbots team has made their code open source under the "NUbots" organization on GitHub. The tools developed by the team, including NUsight and the vision data tool, are available for public use and contribution.
Q: How does the NUbots team handle errors and debugging in their software systems?
A: The team relies on their visual debugging tool, NUsight, to analyze and debug their software. NUsight provides intuitive visualizations and informative charts that help identify errors and performance issues in real time. It gives the team full control over the visualization and focuses on simplicity to ensure reliability.
Q: What is the main advantage of using the NUClear software framework?
A: NUClear offers high modularity, fast execution, and easy message passing. Its multi-threaded design simplifies handling concurrent tasks and allows efficient communication between modules. The framework's philosophy of being easy to use correctly and hard to use incorrectly improves its usability and reliability.
Q: How does the NUbots team optimize their walk engine?
A: The NUbots team uses a multi-objective genetic algorithm in their walk optimization tool. It lets them explore different parameter sets and fine-tune the walk engine for stability and performance. By running many generations and individuals, the team can find good solutions for various walking motions, ensuring efficient locomotion on the field.
Q: What modifications have NUbots made to the igus Humanoid Open Platform?
A: The NUbots team has made several changes to the igus Humanoid Open Platform to improve usability and adapt it to their specific needs. They have modified the 3D-printing materials and adjusted the robot's sizing. These changes aim to enhance the performance and robustness of the robots in competition.
Q: How does the NUbots team handle object detection and localization in their vision system?
A: The NUbots team uses the Visual Mesh for object detection and localization. The algorithm analyzes the camera images and combines them with odometry data to determine the position and characteristics of objects such as goals and balls. The team employs heuristics and deep learning to accurately identify and segment objects in the environment.
Q: How does the NUbots team validate their algorithms and systems in the real world?
A: The NUbots team collects data from the real robot using their vision data tool. This allows them to test and validate their algorithms by comparing real-world data with ground truth, which is obtained using a motion capture system for precise and accurate comparison and performance evaluation.
Q: Can the NUbots team's tools be used with other robotic platforms and competitions?
A: NUbots' tools, such as NUsight and the walk optimization tool, are designed to be versatile and adaptable. While they were developed specifically for RoboCup and the team's own hardware, they could potentially be modified for use with other robotic platforms and competitions.