A quick look at testing and validating localisation and path planning within the StreetDrone ONE
The video below shows a replay of a pre-recorded LiDAR point cloud of the Radcliffe Camera, Oxford, within our visualisation tool, RViz. In this example, we were testing the accuracy of our new URDF* vehicle model for future path planning and trajectory simulations.
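At its core, a URDF vehicle model is an XML description of the robot's links (rigid bodies) and the joints that connect them, which is what RViz renders alongside the point cloud. The sketch below is purely illustrative — the link names, dimensions and sensor pose are placeholders, not the actual StreetDrone ONE model:

```xml
<?xml version="1.0"?>
<!-- Illustrative URDF sketch: a chassis with a fixed LiDAR mount.
     All names and dimensions are placeholders, not StreetDrone's model. -->
<robot name="vehicle_sketch">
  <link name="base_link">
    <visual>
      <geometry>
        <box size="2.5 1.4 1.5"/>  <!-- rough chassis envelope, metres -->
      </geometry>
    </visual>
  </link>
  <link name="velodyne_link"/>
  <joint name="velodyne_mount" type="fixed">
    <parent link="base_link"/>
    <child link="velodyne_link"/>
    <origin xyz="0 0 1.6" rpy="0 0 0"/>  <!-- sensor pose in the chassis frame -->
  </joint>
</robot>
```

The fixed joint pins the LiDAR frame to the chassis, so every point in the cloud can be transformed into the vehicle frame for localisation and planning.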
Using the Normal Distributions Transform (NDT), we extract a map of a previously unknown environment from the filtered point cloud collected with a Velodyne LiDAR Puck (VLP-16). This map is then used to localise the vehicle within the environment.
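NDT summarises the mapped environment as a grid of voxels, each represented by the mean and covariance of the points that fall inside it; a new scan is then registered by maximising the likelihood of its points under those per-voxel Gaussians. Here is a minimal 2D sketch of the map-building step — a toy illustration of the idea, not Autoware's ndt_mapping implementation:

```python
import math
from collections import defaultdict

def build_ndt_map(points, cell_size=1.0):
    """Summarise a 2D point cloud as per-cell Gaussian statistics.

    points: iterable of (x, y) tuples; cell_size: voxel edge length in metres.
    Returns {cell_index: (mean, covariance)} for cells with >= 3 points.
    """
    cells = defaultdict(list)
    for x, y in points:
        cells[(math.floor(x / cell_size), math.floor(y / cell_size))].append((x, y))

    ndt_map = {}
    for idx, pts in cells.items():
        if len(pts) < 3:  # too few points for a stable covariance estimate
            continue
        n = len(pts)
        mx = sum(p[0] for p in pts) / n
        my = sum(p[1] for p in pts) / n
        cxx = sum((p[0] - mx) ** 2 for p in pts) / (n - 1)
        cyy = sum((p[1] - my) ** 2 for p in pts) / (n - 1)
        cxy = sum((p[0] - mx) * (p[1] - my) for p in pts) / (n - 1)
        ndt_map[idx] = ((mx, my), ((cxx, cxy), (cxy, cyy)))
    return ndt_map

# Four points inside one 1 m cell: the cell mean lands at their centroid.
demo = build_ndt_map([(0.1, 0.1), (0.3, 0.2), (0.2, 0.4), (0.4, 0.3)])
```

In the real 3D pipeline the same statistics are computed per voxel, and the vehicle pose is found by gradient-based optimisation of the scan's likelihood against the map.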
The video also demonstrates the waypoint extraction capabilities of our AI software stack, using a collection of ROS** nodes provided by the open-source self-driving stack Autoware. For trajectory generation, we compute the vehicle's heading angle, the start and end points, the distance between points, and the next branching lane and its radius, and we find the closest lane to follow and its neighbours based on the waypoints. These are used as inputs to control the acceleration, deceleration and curvature behaviour of the vehicle. Before outputting commands to the vehicle's drive-by-wire system, we check that the curvature is valid against the minimum look-ahead distance.
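The final curvature-and-look-ahead check can be sketched with pure-pursuit geometry, the approach Autoware's path follower uses: the controller steers along a circular arc through a waypoint one look-ahead distance away, and a waypoint closer than the minimum look-ahead is rejected rather than commanded. This is a simplified illustration — the function names and the 2 m threshold are our own placeholders, not StreetDrone's tuned values:

```python
import math

MIN_LOOKAHEAD = 2.0  # metres; illustrative threshold, not a tuned value

def pure_pursuit_curvature(pose, target):
    """Curvature of the arc from the vehicle pose to a look-ahead waypoint.

    pose: (x, y, yaw) of the vehicle; target: (x, y) waypoint.
    Returns (curvature, valid): valid is False when the waypoint lies
    inside the minimum look-ahead distance, so no command is issued.
    """
    x, y, yaw = pose
    dx, dy = target[0] - x, target[1] - y
    lookahead = math.hypot(dx, dy)
    if lookahead < MIN_LOOKAHEAD:
        return 0.0, False
    # Lateral offset of the target expressed in the vehicle frame.
    lateral = -math.sin(yaw) * dx + math.cos(yaw) * dy
    # Pure-pursuit arc: curvature = 2 * lateral_offset / lookahead^2.
    return 2.0 * lateral / lookahead ** 2, True

# A waypoint 10 m dead ahead yields zero curvature (drive straight).
k, ok = pure_pursuit_curvature((0.0, 0.0, 0.0), (10.0, 0.0))
```

Positive curvature turns the vehicle towards a waypoint on its left, negative towards one on its right, which is what ultimately shapes the steering command sent over drive-by-wire.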
This process forms a fundamental part of our approach to testing and validating localisation and path planning within the StreetDrone ONE research platform.
*The Unified Robot Description Format (URDF) is an XML file format used in ROS** to describe all the elements of a robot.
**ROS (Robot Operating System) is an open-source robotics middleware.