It's alive! We mounted the Nvidia Jetson TX2 board, which is going to be our brain and will later run the deep learning algorithms. The motor controller wiring has also been finished with the help of a 3D-printed bracket, and the system power has been wired up. The system is powered by a LiPo battery, the same type used in drones. A 360-degree single-plane lidar was also installed, again with a 3D-printed bracket. After the expected minor technical difficulties, we set up the ROS (Robot Operating System) environment and are able to control the vehicle through remote control. Time to terrorise the office and deliver some coffees while we're at it.
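For a feel of what "remote control" means under the hood: a common approach is to mix the transmitter's throttle and steering sticks into left/right wheel commands for a differential-drive base. This is only an illustrative sketch; the function name, channel mapping, and the actual motor-controller interface on Hugo are assumptions, not our real code.

```python
def mix_rc_to_differential(throttle, steering, max_speed=1.0):
    """Mix normalised RC inputs (each in -1..1) into left/right wheel speeds.

    Arcade-style mixing: throttle drives both wheels, steering adds a
    differential. Hypothetical example; Hugo's real control stack uses ROS.
    """
    left = throttle + steering
    right = throttle - steering
    # Scale both outputs down together if either exceeds the allowed range,
    # so the turn direction is preserved at full stick deflection.
    biggest = max(abs(left), abs(right), 1.0)
    return (left / biggest * max_speed, right / biggest * max_speed)
```

Full throttle with centred steering gives `(1.0, 1.0)` (straight ahead), while full steering alone gives `(1.0, -1.0)`, spinning the robot in place.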
The next step is to dive deeper into navigation: how will Hugo actually deliver a package? We will use a combination of GPS, camera, lidar, and likely radar down the line to get Hugo from point A to B. Our deep learning team may soon get involved to assist with camera-based navigation.
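As a taste of the GPS side of getting from A to B, the basic building block is computing the distance and initial bearing between two fixes. A minimal sketch, assuming a spherical Earth and plain decimal-degree coordinates; the function name is illustrative and not part of Hugo's actual stack.

```python
import math

def gps_to_target(lat1, lon1, lat2, lon2):
    """Return (distance_m, bearing_deg) from one GPS fix to another.

    Great-circle maths on a spherical Earth (radius ~6371 km).
    Bearing is degrees clockwise from true north.
    """
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    # Haversine formula for great-circle distance.
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    distance = 2 * R * math.asin(math.sqrt(a))
    # Initial bearing from point 1 towards point 2.
    y = math.sin(dlmb) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlmb)
    bearing = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
    return distance, bearing
```

A planner can feed the bearing to a heading controller while the lidar (and later camera/radar) handles local obstacle avoidance along the way.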