FTR: Field Test Rover¹
/fiːld tɛst ˈrəʊvə/
- A four-wheeled rover-like platform fitted with a representative suite of sensors, able to withstand full nights of testing in Moon analogues, with the purpose of demonstrating and validating the Lunar Prospector Rover mission.
- Good old LRM (Lunar Rover Model), keeping me busy since 2013.
This post is about the capabilities of the FTR, the robotic platform currently being tested in the Earth analogue of Minas de San José. After a challenging year we have a rover that can talk to a Ground Control Station with a soft real-time configurable delay, receive and execute plans autonomously, move with four different manoeuvres, process and send images from six different cameras at the same time, give a localisation solution with five different techniques plus ground truth, produce Digital Elevation Maps and pointclouds from seven different sensors, point the sensors at the top of the first mast with its PTU and, never forget, switch on and off the lamps that go around the FTR.
But let’s go subsystem by subsystem.
First things first: if you want to explore some moons, you need your rover to move as nicely and gracefully as a balloon. The locomotion subsystem is the most critical part of a rover, because without it the FTR would be the FTL (Field Test Lander), and we are not getting paid for that.
To address this critical point, the FTR carries a very special guest inside: the RTCC, a LEON3 FPGA board that communicates with the ELMO drives to move the 8 motors, and with the OBCs on the other side. Through this board, the locomotion module can execute the following commands:
- LOMA (angle, speed): Ackermann motion
- LOMC (angle, speed): Crab motion
- LORO (speed): point turn motion
- LOST: locomotion stop
- LOHO: locomotion homing position
- LOMT (x, y, yaw, speed): locomotion move to
- LOMY (x, y, speed): locomotion move to, without a specified final yaw
- LORT (angle, speed): point turn to a given angle
- LODC (angle, speed, distance): distance crab, move with a given angle and speed for the given distance.
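As a rough illustration of how compact this command set is, here is a minimal sketch of building those telecommands on the OBC side. The command names and their arguments come from the list above; the ASCII wire format, the argument precision and the `encode` helper are assumptions for illustration, not the FTR's actual protocol.

```python
# Hypothetical dispatcher for the FTR locomotion command set.
# Command names and argument lists are from the post; everything
# else (encoding, formatting) is an illustrative assumption.
COMMANDS = {
    "LOMA": ("angle", "speed"),              # Ackermann motion
    "LOMC": ("angle", "speed"),              # crab motion
    "LORO": ("speed",),                      # point turn
    "LOST": (),                              # locomotion stop
    "LOHO": (),                              # homing position
    "LOMT": ("x", "y", "yaw", "speed"),      # move to pose
    "LOMY": ("x", "y", "speed"),             # move to, free final yaw
    "LORT": ("angle", "speed"),              # point turn to a given angle
    "LODC": ("angle", "speed", "distance"),  # crab for a given distance
}

def encode(name: str, *args: float) -> str:
    """Build a simple ASCII telecommand string, e.g. 'LOMA 10.000 0.050'."""
    expected = COMMANDS.get(name)
    if expected is None:
        raise ValueError(f"unknown locomotion command {name!r}")
    if len(args) != len(expected):
        raise ValueError(f"{name} expects {expected}, got {args}")
    return " ".join([name, *(f"{a:.3f}" for a in args)])
```

Validating the argument count against a single table like this keeps a malformed plan from ever reaching the RTCC.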
One of the things studied under the LUCID activity is the impact of localisation error on the performance of the operator. Moreover, we want to test how these techniques perform in the analogue over different traverses. So we have given the FTR the ability to provide a localisation solution (simultaneously) with five localisation techniques:
- LOC1: inertial sensors and wheel encoders
- LOC2: pointcloud matching from the Velodyne
- LOC3: monocular visual odometry
- LOC4: XB3 stereo visual odometry
- LOC5: BB2 stereo visual odometry
LOC1 relies on how much the wheels are moving, so slipping in the loose soil does not help at all. LOC2 works in full darkness, so it might be the solution for entering permanently shadowed areas. The visual techniques give great localisation, but you are tied to your camera calibration and to finding features. Probably the solution is a combination of all of them, but when to use which? We will answer this question after the LUCID field test.
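To see why slip hurts LOC1 so much, here is a minimal dead-reckoning sketch in its spirit: integrate the encoder-reported distance along the heading from the inertial sensors. This is an illustrative model, not the FTR's actual estimator; the point is that any distance the wheels spin without the rover moving goes straight into the position error.

```python
import math

def integrate(pose, wheel_distance, yaw):
    """Advance (x, y) by the encoder-reported distance along the IMU yaw.

    If the wheels slip, wheel_distance over-reports the true motion and
    the estimate drifts ahead of the rover.
    """
    x, y = pose
    return (x + wheel_distance * math.cos(yaw),
            y + wheel_distance * math.sin(yaw))

pose = (0.0, 0.0)
# Drive 1 m east, then 1 m north (yaw in radians).
for d, yaw in [(1.0, 0.0), (1.0, math.pi / 2)]:
    pose = integrate(pose, d, yaw)
# pose is now approximately (1.0, 1.0)
```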
Sending images is a vital part of any exploration mission, but cameras are also delicate, data-hungry systems that bring a lot of headaches to engineers. What will be the camera system of your rover? Where would you mount it and why? Will you go with stereo, a spherical camera, infrared cameras, which lenses..?
So, in order to support future exploration missions, we have loaded the FTR with a bunch of cameras, each chosen and placed very carefully, with a lot of thought behind it.
- Ladybug 360º camera: gives you a spherical image of all the surroundings of the FTR. Good for getting immediate information about the vicinity of the rover.
- Bumblebee XB3 stereo camera: mounted on the PTU, it can give you depth information using a combination of two of its three eyes. It can be pointed towards the area you want to reconstruct.
- Stereo camera BB2: mounted on a fixed support on top of the FTR deck, its two cameras can also give you depth information. Great for stereo visual odometry.
- Hazcams: with fish-eye lenses, their purpose is to spot any hazard coming close to the wheels, such as a stone or a crevasse.
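The reason the XB3 and BB2 pairs can give depth comes down to one triangulation formula: a feature seen at slightly different horizontal pixel positions in the left and right images has depth Z = f · B / d, where f is the focal length in pixels, B the baseline between the eyes and d the disparity. The numbers below are made-up illustrative values, not the real camera parameters.

```python
def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Triangulated depth in metres from pixel disparity: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive "
                         "(point in front of both cameras)")
    return focal_px * baseline_m / disparity_px

# Example with assumed values: f = 800 px, B = 0.12 m, d = 16 px -> 6.0 m
z = depth_from_disparity(800.0, 0.12, 16.0)
```

This is also why the XB3 offers a choice of eye pairs: a wider baseline B shrinks the depth error at long range, while a narrower one keeps nearby terrain inside the matchable disparity range.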
Nothing to say here, lamps are awesome and kids love them (fact). We even implemented a disco mode.
This post is part of a series on the capabilities of the field test rover. Second part coming soon.