
Legal disassembly

Black hat, white hat

Does this mean that Lynn admits that his disassembling of the code was unlawful, or that, in the future, he will be free to continue to disassemble Cisco code as long as he does not do so unlawfully? This raises the ultimate question - where an EULA restricts reverse engineering, is reverse engineering for security purposes unlawful?

Some case history on code disassembly

In two cases, Atari Games Corp. v. Nintendo of America, Inc. and Sega Enterprises v. Accolade, Inc., courts addressed whether video game producers could legally reverse engineer software contained in computer game consoles, to decipher the unprotected security codes necessary for game cartridges to operate on the consoles. Both courts essentially held that reverse engineering itself (provided the software was obtained lawfully) was a lawfully permitted "fair use" of the code under copyright law. But that is just copyright law.

In 2003, a federal court in Bowers v. Baystate Technologies, Inc. decided, as a matter of contract law, that a "no reverse engineering" provision in a "shrinkwrap" EULA was enforceable. In other words, while you have a right to reverse engineer software, you can give up that right by entering into a contract with the software manufacturer (the copyright holder) - and you enter into that contract simply by obtaining the software. Some right. What is worse is that, even though decompiling is fair use and therefore not itself a copyright violation, if the EULA says you can't do it and you do it anyway, you breach your license agreement - and conduct outside the scope of the license can then be treated as copyright infringement. To add another twist, the Cisco EULA "contract" says that you agree not to decompile unless you are entitled to do so under law.

Now, there are many reasons why you might want to decompile or reverse engineer software - some perfectly valid, some less so. Compatibility and integrity are perfectly valid reasons to decompile some or all of the code, and the Cisco EULA expressly provides a process for obtaining certain code necessary for compatibility - with Cisco's consent. Decompiling code to commit "piracy," or for some other infringing purpose, is not something the law should encourage. And certainly there are strong public policy reasons to allow responsible security researchers to pick apart the code to find vulnerabilities.

Hackers and true "black hats" will not be deterred by the terms of the EULA, however - let's face it, they probably never bought the code in the first place and never entered into the clickwrap EULA. Thus, the "bad guys" will pick apart the code to discover and exploit vulnerabilities anyway. The implication of the Lynn case is that you must first ask the software vendor's permission to pick apart the code. I can imagine vendors granting permission with restrictions: you may reverse engineer the code to find vulnerabilities, provided that you agree to tell US about them - and never, ever tell anyone else. In other words, they get the opportunity to fix the code, and nobody is the wiser.

All contracts, including those in an EULA, are subject to being declared void if they are "against public policy." There is a strong public policy goal in encouraging responsible discovery and disclosure of security vulnerabilities (though I express no opinion here about whether Lynn's disclosure was responsible or not). Where the terms of an EULA prevent such responsible disclosure, and therefore make the world less secure (especially where the reverse engineering is itself permitted under copyright law), those terms might be void as against public policy. This is particularly true where the vulnerability itself would significantly affect public health, safety and welfare, and even more so if the vendor simply refused to take action. Imagine a contract obligating you not to look for security problems in a nuclear power plant, or not to disclose any that you found. Courts would have a hard time enforcing such a provision. The problem here is that not all discoveries and disclosures are "responsible."

A guide to disassembly, legally

Here is a simple proposal. It seems to be okay to reverse engineer for the purpose of looking for security vulnerabilities provided that you disclose such vulnerabilities consistent with the rules set out for such disclosure. This would place an obligation on the software vendor to act promptly, fix and disclose the vulnerability, and to give credit for the discovery where it is due. This "right" would be implied in every EULA. If the vendor fails to act responsibly with respect to the disclosed vulnerability, the discoverer gets to not only disclose the details of the vulnerability, but also the bits of code that are vulnerable after a reasonable time has passed. This would provide a significant incentive to the vendors to act promptly. Finally, any reverse engineering for security purposes would have no effect on the "trade secret character" of the underlying copyrighted software, since it would be done pursuant to what is essentially an NDA. Of course, for this to work, everybody would have to agree on what "responsible disclosure" is. And if we could all agree on something like this, what would we need lawyers for?

Copyright © 2005

Mark D. Rasch, J.D., is a former head of the Justice Department's computer crime unit, and now serves as Senior Vice President and Chief Security Counsel at Solutionary Inc.
