Here's the talk summary:
The control of the Mars Exploration Rovers (MER) requires a complex set of coordinated activities by a team. Early in the MER mission, the author automated in Python much of the work of one of the operations positions, the Payload Uplink Lead, for 7 of the 9 cameras on each rover. This talk describes the MER rovers, the operations tasks, and the implemented system. They used GigaPan images.
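The talk didn't cover implementation details in the part I saw, but the flavor of automating a repetitive uplink task in Python might look something like this minimal sketch. Everything here is invented for illustration: the command names, camera labels, and `ImageRequest` structure are assumptions, not the actual MER tooling.

```python
from dataclasses import dataclass

@dataclass
class ImageRequest:
    """One requested image (hypothetical structure, not real MER data)."""
    camera: str       # e.g. "NAVCAM_L" (invented label)
    azimuth: float    # degrees
    elevation: float  # degrees
    exposure_ms: int

def build_command_sequence(requests):
    """Turn a list of image requests into a flat list of command strings,
    grouping consecutive shots on the same camera so it is powered on and
    off only once per run (all command mnemonics are invented)."""
    commands = []
    active_camera = None
    for req in requests:
        if req.camera != active_camera:
            if active_camera is not None:
                commands.append(f"CAM_POWER_OFF {active_camera}")
            commands.append(f"CAM_POWER_ON {req.camera}")
            active_camera = req.camera
        commands.append(
            f"CAM_POINT {req.camera} AZ={req.azimuth:.1f} EL={req.elevation:.1f}"
        )
        commands.append(f"CAM_EXPOSE {req.camera} T={req.exposure_ms}ms")
    if active_camera is not None:
        commands.append(f"CAM_POWER_OFF {active_camera}")
    return commands
```

The point of a tool like this is less the string formatting than the bookkeeping: a human Payload Uplink Lead doing this by hand has to remember details like power cycling, which is exactly the sort of error-prone repetition worth scripting.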
They use virtual reality to visualize what's going on.
Dust was a serious problem for the rovers.
There's a lot of Python involved.
The speaker's background is in machine learning and robotics.
The rovers have been running for 6-7 years. They find 1-2 bugs a year. Bugs are usually fixed in a matter of hours.
They use the Ames Vision Workbench, Nebula, and OpenStack, all three of which are open source.
The speaker is from NASA Ames Research Center.
Side note: unfortunately, I missed the first five minutes.