What could be wrong with a self-styled environmentalist satellite company, called Planet, with the motto, “Using space to help life on Earth”? According to a recent article in Pacific Standard, Planet has launched nearly 300 of its mini-satellites, each about the size of a loaf of bread, to image the entirety of the Earth’s landmass every 24 hours. “It’s an impressive feat with enormous implications for environmental research and climate monitoring,” the article’s author writes.
However, she also interviews Timothy Edgar, a former intelligence official and privacy lawyer who worked on related issues at the American Civil Liberties Union and inside the Pentagon, and who now teaches cybersecurity for Brown University’s Executive Master in Cybersecurity program and Watson Institute for International and Public Affairs. He reminds us that projects with good intentions should not be exempt from rigorous privacy standards. The article summarizes Edgar’s thoughts in the following:
Edgar… says that, when it comes to privacy concerns around this kind of imaging, ‘It’s not just about whether you can pick out a face. You can interpret a lot through context. If you can see people at all, if you can see cars—and if you can do that on a daily basis—you could see if somebody is on vacation or not.’
Even if Planet’s satellites are relatively low-resolution compared to larger, more traditional models, Edgar notes that this ‘is the way the intelligence community has always used satellite imagery. You study what cars are parked in the parking lot; that can reveal an enormous amount of information. It’s used to plan military attacks. It could be used for corporate espionage.’ What about somebody going through a bad break-up who sets up a daily search for an ex’s house, he asks? What about a government tracking rebel troop movements, or even the movements of a particular ethnic group? Is Planet prepared for the human rights implications of that?
The key to preventing such data abuses, in Edgar’s view, lies in the processes the company has in place for enforcing its values and auditing its customers’ data use—including tracking customers that might be misrepresenting the ways they’re using data or manipulating it to show something it doesn’t. ‘When you have those kinds of capabilities, the worst thing you can do is say, don’t be evil and think that’s good enough,’ Edgar says.
To read the rest of the article, go here.