Maintaining Subsystems

How to maintain subsystems within the main codebase.

This page details how to maintain various subsystems within the main codebase.

Vision

The accuracy of the vision system relies on the accuracy of odometry and kinematics, since these affect the placement of the mesh and the green horizon. It is important that these systems work reasonably well, otherwise the robot may have trouble detecting objects.

Dataset Generation

Synthetic and semi-synthetic training data for vision can be generated using NUpbr. Pre-generated datasets for training the Visual Mesh are on the NAS in the lab.

NUpbr

NUpbr is a Physically Based Rendering tool created in Blender. It creates semi-synthetic images with corresponding segmentation masks for training.

Find out how to get and use NUpbr on the NUpbr NUbook page and the NUpbr GitHub repository.

Setting Up The Data

The Visual Mesh requires raw images, segmentation masks and metadata, as outlined in the Quick Start Guide. NUpbr can provide all of these as output, and premade data is available on the NAS. The data then needs to be converted to the TFRecord format using a script in the Visual Mesh repository. The Quick Start Guide describes how to use it.
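
As a rough sketch, a dataset ready for conversion might be laid out as follows. The file names here are hypothetical; the Quick Start Guide defines the exact naming scheme the conversion script expects:

    dataset/
      image0000001.jpg   # raw image
      mask0000001.png    # segmentation mask
      meta0000001.json   # lens and pose metadata
      image0000002.jpg
      ...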

The Visual Mesh

Training and Testing

Go to the NUbook Visual Mesh Getting Started guide to find out how to train and test a network using an example dummy dataset.

Exporting Configuration

The resulting network should be exported to a YAML file and added to the NUbots codebase by completing the following steps.

  1. Create a base configuration file. Example YAML files can be found in the Visual Mesh repository and in the NUbots repository.

  2. Export the weights of your trained Mesh to this configuration file using the following command, where <output_dir> is the directory of the configuration file:

    ./mesh.py export <output_dir>
  3. Add this configuration file to the NUbots repository in the VisualMesh module. Replace or add a configuration file depending on the use case of the Mesh: RobocupNetwork.yaml is for playing soccer on the real robot, and WebotsNetwork.yaml is for playing soccer in the Webots simulator. View the Git Guide for information on using Git and submitting this change in a pull request.
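
For example, if the base configuration file was created in a directory such as output/robocup (a hypothetical path), the export would be:

    ./mesh.py export output/robocup

The YAML file in that directory will then contain the weights of the trained Mesh, and can be copied into the NUbots repository as in step 3.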

Camera Calibration

The vision system cannot work optimally if the cameras are not calibrated correctly. The input page describes the camera parameters that can be calibrated.

An automatic camera calibration tool is available in the NUbots repository. See the camera calibration guide to find out how to use this tool.

Testing

After updating the Visual Mesh in the NUbots repository, it should be tested before merging. Refer to the Getting Started guide for assistance with the following steps.

  1. Build the code, ensuring ROLE_visualmesh is set to ON in ./b configure -i, and install it to the robot. Ensure the new configuration file is installed by using the -cu or -co options when installing. Check out the Build System page to find out more about options when installing onto the robot. A combined sketch of steps 1 to 4 is given after this list.

  2. Once the new Visual Mesh is installed on the robot, connect to it using:

    ssh nubots@<address>
  3. Enable the NUsight messages:

    nano config/NetworkForwarder.yaml

    Turn the vision object messages and compressed images on. Run NUsight using yarn prod and navigate to the NUsight page in your browser. More information on NUsight can be found on the NUsight NUbook page.

  4. Run the visualmesh role:

    ./visualmesh
  5. Wait for the cameras to load and then watch the Vision tab in NUsight. To determine if the output is correct, consult the vision page for images of the expected output.
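
As a combined sketch of steps 1 to 4, a typical session might look like the following. The robot address is a placeholder, and the exact ./b install arguments are described on the Build System page:

    # On your computer: set ROLE_visualmesh to ON in the menu, then build
    ./b configure -i
    ./b build

    # Install to the robot, updating changed configuration files
    ./b install <address> -cu

    # On the robot: enable the NUsight messages and run the role
    ssh nubots@<address>
    nano config/NetworkForwarder.yaml
    ./visualmesh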

If you would like to see the Visual Mesh output in NUsight, you will need to log the data and play it back in NUsight using DataPlayback, since the data is too large to send over a network. Use the steps in the DataLogging and DataPlayback guide to record and play back data. Adapt those instructions for this purpose using the following hints:

  • In step 1 of Recording Data, use the visualmesh role to record the data.
  • In step 2 of Recording Data and step 4 of Playing Back Data, set message.output.CompressedImage to true and add message.vision.VisualMesh: true in both DataLogging.yaml and DataPlayback.yaml.
  • In steps 1, 2 and 5 of Playing Back Data, use the playback role to play back the data, without changes.
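
With those hints applied, the relevant entries in both DataLogging.yaml and DataPlayback.yaml would look something like the following excerpt (the exact nesting may differ; see the files themselves for the surrounding structure):

    message.output.CompressedImage: true
    message.vision.VisualMesh: true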

Tuning Detectors

The Visual Mesh may give positive results after training but perform poorly when used on a robot. In this case, the detectors may need tuning.

BallDetector.yaml and GoalDetector.yaml contain the values for tuning the ball and goal detectors respectively.

  1. Build and install the visualmesh role to a robot.

  2. SSH onto the robot.

  3. Enable NUsight messages on the robot by running

    nano config/NetworkForwarder.yaml

    and set message.vision.Balls and message.vision.Goals to true.

  4. Run NUsight using yarn prod on a computer. Set up NUsight using the Getting Started page if necessary.

  5. Run ./visualmesh on the robot.

  6. Alter the configuration files for the detectors while the ./visualmesh binary is still running on the robot. In a new terminal, SSH onto the robot again and run:

    nano config/BallDetector.yaml

    Change the values. Upon saving, the changes will be applied immediately by the robot, without needing to rebuild or rerun the ./visualmesh binary.

  7. Repeat step 6 for the goal detector by running

    nano config/GoalDetector.yaml

In general, it may be useful to adjust the confidence_threshold on both detectors to improve the results. Other variables may also give better results with different values, with the exception of log_level and the covariances (goal_projection_covariance and ball_angular_cov), which should be left unchanged.
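
As an illustration, a tuning pass on the ball detector might raise the threshold as in the excerpt below. The value shown is hypothetical; start from the value already in the file and adjust from there:

    # config/BallDetector.yaml (excerpt, illustrative value only)
    confidence_threshold: 0.6   # higher values reject low-confidence detections
    # leave log_level and ball_angular_cov unchanged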
