1 Jun

Better Bombing with Machine Learning

If you haven’t noticed that Machine Learning, or Artificial Intelligence depending on your particular project, has come for OpenStreetMap, then you’ve spent the last couple of years under a rock.

Practically every major business wanting to prove its IT engineering salt has come up with some kind of project that uses Machine Learning to process aerial imagery, and OpenStreetMap is an integral part of many of these experiments – at the very least, OpenStreetMap data is used to train the algorithms, but frequently the results are also made available to OpenStreetMap, in the hope that our community will provide valuable feedback to further improve the machines, or lend the work some ethical legitimacy (“we’re doing this for good!”).

Working with organisations that use OpenStreetMap for humanitarian purposes, the purveyors of such machine-made data will often generate favourable headlines about how their algorithm-detected building outlines helped in the aftermath of a flood here, or how their machines were able to distinguish schools from other buildings with so-and-so accuracy and thereby saved lives there.

In my view, what we’re dealing with here is clearly military technology. Being able to detect buildings from the air, perhaps even to trace roads, power grids, and communications lines, and to automatically distinguish hospitals from schools from government buildings – these are essential ingredients of future bombing technology. This is every general’s wet dream. There is absolutely no question in my mind that the algorithms that are today built and trained with OpenStreetMap will soon help guide bombs and missiles – whether that is to avoid the schools or to target them on purpose is not going to be the algorithm’s concern.

Now, every technology can be used by the military, and not everything the military does is about killing people. But nothing we in OpenStreetMap have ever collaborated on has had such a direct link to bombing as the automated evaluation of aerial imagery. I am taken aback at the utter naïveté with which humanitarian organisations partner with the purveyors of machine learning. What I’m seeing is a bunch of engineers happily building the moral equivalent of the next nuclear bomb, at best not thinking about possible consequences, at worst being directed to ignore possible consequences for financial gain. I find it disingenuous to claim that you’re developing some sort of machine learning thing to aid humanitarian purposes. No, you’re not – you’re abusing some humanitarian project as a fig leaf for your military research.

Now we at OpenStreetMap have no influence over what our data is used for – the open license does not allow us to discriminate against any field of endeavour. (Machines that have been trained with our share-alike data should fall under the share-alike provisions of the license but that’s a point for a separate discussion.)

What we can control is just how jubilantly we welcome the results of this military research. I think we should be very skeptical when people reach out to us and offer us any form of cooperation that deals with automatic processing of aerial imagery. Before we applaud their efforts, give them a platform to whitewash their research with a humanitarian fig-leaf, or even participate in training their machines by adding their data to our database, we should ask very tough questions. We should ask if the business in question is aware of the dual use of these algorithms, and what ethical guidelines are in place to ensure that “humanitarian” work done in and with OSM is not actively contributing to the creation of better bombing bots.

OpenStreetMap has never been an automatic image recognition project. There is innocence in having individual human beings trace their neighbourhood buildings from aerial imagery. This approach works, but it works slowly, and that has left us open to seduction by the purveyors of weaponizable automatic algorithms.

Let us be aware that every time we allow automatically traced data into our database, we’re complicit in someone, somewhere, building the better killing machine.
