Graham Morrison reports on a pioneer at the heart of a revolution on the final frontier.
Space. It’s big. And the costs associated with getting large chunks of human-engineered debris accelerated to escape velocity are on a similar scale. The 2010-adjusted cost of the Apollo programme, which ran from 1959 to 1973, comes to approximately $109 billion, for example. And it’s astronomical costs like these that have undoubtedly helped push investment in space exploration down the agenda in various political manifestos. Our current age of austerity must surely be the final nail in the coffin for the kind of governmental sponsorship that helped get mankind to the moon.
This has had a perhaps unsurprising side-effect – the democratisation of space, whereby individuals and companies have been able to take up some of the slack and create their own space-bound projects, or help space agencies deliver far better value for their more limited budgets. This is something that would have been unimaginable without the great technological leaps we’ve made over the last 50 years. To commemorate 40 years since the Apollo 11 mission, for instance, Google published the original code for the Apollo Guidance Computer’s command module and lunar module software in 2009. It’s less than 2,000 lines of assembly language.
Choosing Linux isn’t about cost. It’s about choosing the best solution for the job and not re-inventing the wheel. And this is why Linux is having a profound effect on science and space – it’s why the International Space Station switched, with the United Space Alliance being quoted as saying, “We migrated key functions from Windows to Linux because we needed an operating system that was stable and reliable” in the original article on ExtremeTech (bit.ly/1bD0UWD), and it’s why Linux is such a common component at institutions such as CERN.
But the most recent space-bound use of Linux we’ve seen is as the operating system inside an unbelievably small satellite that’s (almost) launched by astronauts throwing boxes out of the back of the International Space Station – yes, as it hurtles around the planet some 330km above us. The project is being run by a private company called Planet Labs. It’s not yet clear how the company is going to monetise its assets or its innovation, though there’s presumably a well thought-out business case behind it all; it’s simply too early to tell. With that caveat out of the way, what we’ve seen so far from Planet Labs genuinely gets us excited, because not only are Linux and open source at the heart of its technology, it’s also attempting to change the world for the better.
The idea is simple enough to visualise: create a large ring of satellites that stays fixed with respect to the sun while the Earth rotates beneath it. Each satellite then takes a picture of every position on the Earth once every 24 hours – a procedure that Planet Labs CEO, Will Marshall, likens to a line scanner for the planet. The satellites then beam back those images, which are processed and made accessible to everyone through an API, and the resolution is so good that you can make out individual trees. With access to data like this you can easily imagine monitoring deforestation or the shrinking ice caps, the crop yield of different forms of agriculture, or even the size and scale of opencast mining output.
Eyes in the sky
To get the kind of ubiquitous coverage needed to complete a photo cycle every 24 hours, Planet Labs is going to need more than 100 satellites in orbit. Fortunately, it’s well on its way. With the first launch of 28 satellites from the International Space Station in February 2014, its fleet became the largest constellation of Earth-orbiting satellites in human history, and this was followed by more launches from the ISS and even the Russian Dnepr rocket.
We spoke to one of the founders of Planet Labs (and its CEO), Will Marshall, after he gave an excellent presentation on this very subject at this year’s OSCON in Portland. Considering the huge potential for both business and humanitarian efforts, our first question was whether both aspects of the image data would take equal precedence.
“Yes, absolutely,” he replied. “I don’t know if they take equal precedence – I would say our overriding goal is to help humanity with the data, but it’s great to have a solid business case to help to boost that.”
Planet Labs is perhaps not dissimilar to Canonical in trying to create a commercial business with an altruistic side, and Will started to tell us how the ideas behind Planet Labs began to take shape. He told us that while he’d been working at NASA, they’d been experimenting with what they now call ‘PhoneSats’. These were literally smartphones that they were putting in orbit to see if they could work. And they worked just fine.
Money makes the satellites go round
“I worked on a couple of what NASA considered small satellites, with 10–200 million dollars of cost, roughly,” Will told us. “They’re not necessarily physically small, but they’re small in cost, because normal satellites cost half a billion or billions of dollars.”
With the PhoneSat, the aim was to “break down psychological barriers. It’s not as hard as all that. Now, there’s a lot of systems complexity in putting satellites together and working with all of the ground stations and stuff, so it’s not trivial. But nevertheless, it doesn’t need to be a billion dollars.”
Like computers in the 1950s and 1960s, satellites are traditionally huge and heavy. A typical payload is 6,000kg, and that kind of weight needs the entire fairing of a rocket to make it into orbit. Not only is that expensive, it adds many different layers of complexity and organisation, which is why you find countries rather than companies sponsoring and managing their deployment. Part of the solution for Planet Labs is to borrow from the philosophy of agile development – releasing early and releasing often, and taking advantage of the latest consumer technology.
So why hasn’t this methodology been adopted before? “Because the technology wasn’t ready, and because it was a different philosophical approach to satellites and a higher-risk one in a way,” Marshall says. “We hadn’t guaranteed that the technology was going to work. It was a radically different approach. We started Planet Labs because we realised that we wanted to explore the humanitarian and commercial uses of taking imagery of the Earth’s surface.”
The satellites being built at Planet Labs are tiny by comparison (only 10 x 10 x 30cm, and weighing a mere 4kg), like ants beneath the feet of elephants, which is perhaps why they could be built in a garage. The main section is an elongated rectangle containing a small telescope pointing down to a camera at the back. What’s even better is that each one is stuffed full of the latest technology and, amazingly, an x86 PC running Ubuntu. Marshall says that they chose Linux and open source because Planet Labs wanted to be able to rapidly reconfigure its OS to do the things it needed to do. We’re left guessing as to whether it’s a long-term support release, but the lifespan of one of these satellites is only 1–2 years, depending on its altitude, so it might not even matter.
But what’s just as impressive is that alongside its x86 Linux PC, Planet Labs is also using copious amounts of open source both for its onboard processing and for its image processing closer to home. “Most of the image processing stack is on the ground,” Will told us, “but there is some processing on board. Most of the image processing stack on the ground uses open source software built in libraries like GRASS and GDAL and things like this – open source libraries that our employees are helping to develop.”
So does that mean that any of Planet Labs’ changes are making their way back upstream? “Absolutely. That’s our goal.”
“We want to push out whatever useful things that we do to process imagery in a massive way… [we have] a compositor that takes deep stacks of imagery, looks for ones with cloud, rejects those, takes some of the images and pulls them into something that is a coherent composite image that is the highest quality from that stack. So that’s the kind of thing that will be useful for lots of other people; that gets stuff out there and enables other people to work on it too.”
This software is the Pixel Lapse Compositor, and its lead developer, Frank Warmerdam, is already maintaining the project on GitHub (https://github.com/planetlabs/plcompositor). Frank developed, and is still one of the lead maintainers of, the aforementioned GDAL – the Geospatial Data Abstraction Library, a major project used by many other projects to read and write the many kinds of raster geospatial data formats typically used in tracking data. If you’ve ever tracked yourself with a GPS and put the file on your Linux box, you’ll have come across one of these formats and realised that, despite them all being called ‘GIS’, it’s never simple to make sense of the data these files contain. Other open source projects used by the team include PostGIS, NGINX and OpenCV, and another team member, Jesse Andrews, is one of the lead developers of OpenStack.
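Marshall’s description of the compositor can be sketched in a few lines of Python. To be clear, this is not plcompositor’s actual algorithm – just a naive illustration of the idea, assuming clouds show up as unusually bright pixels: mask anything above a brightness threshold, then take the per-pixel median of whatever survives in the stack.

```python
import numpy as np

def composite(stack, cloud_threshold=200):
    """Naive cloud-rejecting composite of a stack of co-registered images.

    Pixels brighter than cloud_threshold are treated as cloud and masked
    out; the result is the per-pixel median of the remaining values.
    """
    stack = np.asarray(stack, dtype=float)           # shape: (n_images, h, w)
    masked = np.where(stack > cloud_threshold, np.nan, stack)
    result = np.nanmedian(masked, axis=0)            # median, ignoring clouds
    # Fall back to the plain median wherever every frame was cloudy
    fallback = np.median(stack, axis=0)
    return np.where(np.isnan(result), fallback, result)

# Three tiny 2x2 "frames", each with a bright cloud in a different pixel
frames = [
    [[50, 60], [255, 70]],
    [[52, 255], [80, 72]],
    [[48, 58], [78, 255]],
]
print(composite(frames))
```

Running this produces a clean 2x2 composite with every cloud pixel replaced by plausible ground values from the other frames. The real tool works on deep stacks of georeferenced rasters and uses far more sophisticated quality metrics than a single brightness cut-off.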
This is just the beginning of the deployment and testing phase, and the crux of the project’s success, at least from our perspective, depends on how the team licence their data and how freely projects will be able to access that data.
“We will enable anyone to access the data via the developer API,” says Marshall. “We’ll talk more about the product when we get ready to launch it, but we intend it to be in that spirit.” That’s great news, and it means that hackers will be able to get their hands on some dramatically up-to-date Earth imagery.
On 19 August 2014, Planet Labs licensed its early imagery under Creative Commons Attribution-ShareAlike 4.0, and while this only covers the images that can currently be found within the company’s hosted gallery, it would be wonderful if a licence like this could eventually be used for the image data obtained through the eventual API. It’s obvious that there are hundreds of applications for this data, and even with the inclusion of commercial interests, there will always be far more potential with an open interface. The thought of an open source project being able to run its own algorithms against the data set – whether it’s someone tracking queueing traffic on the Suez Canal or the water levels in reservoirs, or anything else that the collective imagination can come up with – is a wonderful one.
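As a taste of what such hacking might look like, here’s a minimal sketch of the first step in any algorithm run against the data set: working out which image footprints cover a region of interest. The tile names and coordinates below are entirely made up for illustration, and real footprints would be polygons rather than simple bounding boxes.

```python
def bboxes_intersect(a, b):
    """Boxes are (min_lon, min_lat, max_lon, max_lat) tuples in degrees."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

# Hypothetical footprints for three image tiles
tiles = {
    "tile_a": (31.0, 29.0, 33.0, 31.5),
    "tile_b": (10.0, 45.0, 12.0, 47.0),   # somewhere over Europe
    "tile_c": (32.0, 30.0, 34.0, 32.0),
}

# A rough bounding box around the Suez Canal
suez = (32.2, 29.9, 32.6, 31.3)

covering = [name for name, bbox in tiles.items()
            if bboxes_intersect(bbox, suez)]
print(covering)  # tile_a and tile_c overlap the region; tile_b doesn't
```

From there, the overlapping tiles would be fetched and fed into whatever analysis you fancy – cloud removal, change detection, counting ships in the queue.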