IEEE's prescription for med-tech crowd: preventing hacks is better than a cure
Take these coding standards and, if pain persists, consult your doctor
Medical devices shouldn't be hackable, so the IEEE has published the first steps towards laying down decent security practice for the sector.
Working with the IEEE's Cybersecurity Initiative, a group of researchers has laid down a set of recommendations for current practice, along with research priorities, for medical technology. The paper summarises a two-day workshop held last November.
Building Code for Medical Device Software Security (PDF) includes the sensible – and to regular readers of El Reg, obvious – recommendation that proprietary crypto implementations are a bad idea.
“Cryptographic algorithms that resist serious analysis are notoriously difficult to invent and to program correctly”, the paper states, so “externally developed and certified implementations should be sought; custom implementations of cryptographic components require careful vetting by experts.”
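The paper's advice boils down to: call a vetted implementation, don't write your own. As a minimal sketch (the message and key here are hypothetical), Python's standard-library `hmac` module shows the idea — authenticating a device message with a certified HMAC construction and comparing tags with a constant-time function, rather than a hand-rolled MAC and a naive `==`:

```python
import hashlib
import hmac
import secrets

# Hypothetical example values; a real device would use provisioned keys.
key = secrets.token_bytes(32)
message = b"infusion rate: 2.5 ml/h"

# Tag computed by a vetted library implementation, not custom code.
tag = hmac.new(key, message, hashlib.sha256).digest()

def verify(key: bytes, message: bytes, tag: bytes) -> bool:
    expected = hmac.new(key, message, hashlib.sha256).digest()
    # compare_digest runs in constant time, closing the timing
    # side channel a byte-by-byte `==` comparison could open.
    return hmac.compare_digest(expected, tag)

assert verify(key, message, tag)
assert not verify(key, b"infusion rate: 9.9 ml/h", tag)
```

The same principle applies whatever the platform: the only custom code is the glue around the library call, which is far easier to vet than a custom cipher.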
Developers are also urged to use memory-safe languages, so as to avoid bugs like buffer overflows, null-pointer dereferences, use-after-free, uninitialised memory reads, and illegal free errors.
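The point of memory safety is that the bugs above fail loudly instead of silently. A hypothetical sketch in Python (itself a memory-safe language): writing past the end of a fixed-size buffer raises an exception rather than overwriting whatever happens to sit next to it in memory, as the equivalent C overflow would.

```python
# Fixed-size buffer, valid indices 0..7.
buffer = bytearray(8)

def store(buf: bytearray, index: int, value: int) -> None:
    buf[index] = value  # bounds-checked by the runtime

store(buffer, 7, 0xFF)  # in bounds: fine
try:
    store(buffer, 8, 0xFF)  # out of bounds: raises, no overflow
except IndexError as exc:
    print("rejected:", exc)
```

In an unsafe language the second write could corrupt an adjacent variable or return address and keep running; here the device software gets an error it can detect and handle.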
The document also notes that medical devices need to provide a “tamper-resistant audit trail” for security events such as software installation and user authentication.
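One common way to make an audit trail tamper-evident (a sketch, not the paper's prescribed design) is hash-chaining: each log entry records the hash of the entry before it, so altering any old record breaks every hash that follows. A minimal Python version, with hypothetical event fields:

```python
import hashlib
import json

GENESIS = "0" * 64  # sentinel hash for the first entry

def append_entry(log: list, event: dict) -> None:
    """Append an event, chaining it to the previous entry's hash."""
    prev_hash = log[-1]["hash"] if log else GENESIS
    body = json.dumps(event, sort_keys=True)
    digest = hashlib.sha256((prev_hash + body).encode()).hexdigest()
    log.append({"event": event, "prev": prev_hash, "hash": digest})

def verify_chain(log: list) -> bool:
    """Recompute every hash; any edit to an old record breaks the chain."""
    prev_hash = GENESIS
    for entry in log:
        body = json.dumps(entry["event"], sort_keys=True)
        digest = hashlib.sha256((prev_hash + body).encode()).hexdigest()
        if entry["prev"] != prev_hash or entry["hash"] != digest:
            return False
        prev_hash = digest
    return True

log = []
append_entry(log, {"type": "auth", "user": "nurse7", "ok": True})
append_entry(log, {"type": "install", "pkg": "pump-fw-1.2"})
assert verify_chain(log)

log[0]["event"]["user"] = "admin"  # tamper with an old record
assert not verify_chain(log)
```

A production design would also protect the chain head (e.g. in tamper-resistant hardware or by replicating it off-device), since an attacker who can rewrite the whole log can rebuild the chain.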
If the device-maker believes a piece of equipment needs a hard-coded key for some reason – something that's almost always a bad idea – the paper says it should be protected against tampering or observation.
Designers also need to manage privileges, so that processes always run with the lowest possible OS-level privilege. Other recommendations include digitally signed firmware and decent security logging.
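The signed-firmware recommendation means the device refuses to install an image whose signature doesn't check out. In practice this uses asymmetric signatures verified against a public key baked into the device; as a hypothetical standard-library-only sketch, a shared-key HMAC stands in for the signature check, but the verify-before-install logic is the same:

```python
import hashlib
import hmac

# Hypothetical stand-in for a vendor signing key; real devices verify
# an asymmetric signature against an embedded public key instead.
DEVICE_KEY = b"example-shared-secret"

def sign_firmware(image: bytes) -> bytes:
    return hmac.new(DEVICE_KEY, image, hashlib.sha256).digest()

def install(image: bytes, signature: bytes) -> str:
    """Refuse any image whose signature fails verification."""
    if not hmac.compare_digest(sign_firmware(image), signature):
        return "rejected: bad signature"
    return "installed"

fw = b"pump firmware v1.2"
sig = sign_firmware(fw)
assert install(fw, sig) == "installed"
assert install(fw + b"\x00", sig).startswith("rejected")  # one flipped byte
```

The key property: a single modified byte in the image, or a signature made with the wrong key, stops the install cold.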
The paper was authored by Tom Haigh and Carl Landwehr, with contributions from another 38 workshop participants drawn from NIST, the NSF, Microsoft, Siemens Healthcare, Philips, and a bunch of other companies and universities. ®