I’ve blogged many times about the state of the Internet of Too Many Things, AKA The Internet of Broken Things. Consumer IT is poorly built and, many times, simply shoddy. (I tend to be offended by stuff that is built no better than what we were doing in university labs in the 80s. We know better, and should be doing better.)
If only this were just a problem for consumer gadgets.
Everyone knows that military weapons are more and more automated, incorporating high tech and clever IT through and through. Military needs have long driven the development of digital systems; indeed, both the microcomputer and the internet grew out of technology originally developed for the military.
Of course, real weapons systems have many special requirements, and are developed and tested differently than civilian systems. Among other things, military systems are dangerous—lethal, in fact—and always under threat from adversaries. Hackers screwing with my phone is a problem; hackers screwing with a war machine is disastrous.
So we hope that developers are creating really solid, unhackable systems, no?
This fall, the US GAO (the investigative arm of the US Congress) released a not very reassuring report on “WEAPON SYSTEMS CYBERSECURITY”. The subtitle tells the story: “DOD Just Beginning to Grapple with Scale of Vulnerabilities”.
This isn’t about leaks or spying, this is about the actual machinery itself, which, of course, is built out of computers, networks, and—uh, oh—software.
This unclassified report makes some pretty worrying claims. There are scary war stories about tests in which (friendly) hackers messed with, disabled, or took control of weapons. Given the limited nature of the tests, these results can only mean that any serious adversary could do the same. That’s just about as bad as it can get.
I’m not happy to hear that some of the problems come from the use of “civilian” software and “civilian-grade” errors, such as using open source software and not resetting the default password. I mean, c’mon! I’d sack a grad student who did that, let alone a bazillion-dollar defense contractor.
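The default-password failure, at least, is trivially automatable. Here’s a minimal sketch (the credential list and function names are my own, hypothetical inventions, not from the report) of the kind of check any deployment pipeline could run to refuse to ship a system still using a vendor default:

```python
# Hypothetical sketch: refuse to deploy anything still using a vendor
# default password. Real scanners use much larger credential lists.

DEFAULT_CREDENTIALS = {
    ("admin", "admin"),
    ("admin", "password"),
    ("root", "root"),
}

def uses_default_credentials(username, password):
    """Return True if this account is still set to a known vendor default."""
    return (username, password) in DEFAULT_CREDENTIALS

# A deployment gate would then be one line:
assert not uses_default_credentials("operator", "Xy9-unique-passphrase")
```

This is the sort of check that costs minutes to write and should be a non-negotiable gate long before anything ships, let alone anything lethal.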
Less spectacular perhaps, but even worse, are reports that some systems can’t even be fully tested because they are too complex. I strongly suspect that many aspects of these systems can’t be tested because nobody understands them. Heck, I don’t understand my phone, so I’m not surprised if some fancy system isn’t really understood.
And I have to note that however any single system might have been tested, it is just as important to test the integrated systems. These complex machines are connected and communicating, so even a minor flaw in one may open a dangerous hole in the others. No amount of unit testing can reveal system and interaction errors.
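To see why, consider a toy example (entirely hypothetical, nothing here is from any real system): two modules can each pass their unit tests perfectly and still fail when wired together, in the classic units-mismatch style of the Mars Climate Orbiter loss.

```python
# Toy, hypothetical modules -- each one is "correct" by its own unit tests.

def sensor_altitude_feet():
    """Sensor module: reports altitude in FEET. Passes its unit tests."""
    return 1000.0

def safe_to_deploy(altitude_meters):
    """Control module: expects altitude in METERS. Also passes its unit tests."""
    return altitude_meters >= 500.0  # deploy only above 500 m

# Each module is fine in isolation:
assert sensor_altitude_feet() == 1000.0
assert safe_to_deploy(600.0) is True
assert safe_to_deploy(400.0) is False

# But integrated, feet are silently read as meters. 1000 ft is only about
# 305 m, well below the 500 m threshold, yet the combined check says "safe":
assert safe_to_deploy(sensor_altitude_feet()) is True  # an interaction bug
```

Only a test of the integrated pair, checked against physical reality, catches the mismatch; unit tests of each module in isolation never will.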
The DOD has additional challenges, not least serious limits on information sharing. (They do have fancy acronyms and project names, though.)
The report concludes:
“Program Offices May Have False Sense of Confidence in the Security of Their Programs” p. 27
Given that this report was unclassified and cleared for release, I can’t help but wonder what wasn’t reported. And given the prowess of the US DOD, we have to assume that other countries may have similar problems. (That’s not a comforting thought.)
I hope and assume that there are serious efforts underway to beef up both the design and testing of these systems. The report sketches some initiatives, but they sound pretty inadequate.
I could imagine that the new Cyber Command might have some responsibility for defensive testing. Who better to help bolster our own systems?
I will also suggest that developers should rotate through a shift as adversarial testers, and then back to developing.
One company I worked for did something like that. All the software engineers rotated through three groups: “developers”, “documenters”, and “testers”. This was a really valuable practice; it built perspective and mutual respect, and helped produce better software in the first place. (By the way, being on the “tester” team was fun. Basically, the “developers” bet you that there are no bugs, and the “testers” bet there are. Testers always win the bet.)
- United States Government Accountability Office, Weapon Systems Cybersecurity: DOD Just Beginning to Grapple with Scale of Vulnerabilities. Report to the Committee on Armed Services, U.S. Senate, GAO-19-128, 2018. https://www.gao.gov/assets/700/694913.pdf