Original URL: https://www.theregister.com/2005/08/23/secfocus_lynn/

Legal disassembly

Black hat, white hat

By Mark Rasch

Posted in Security, 23rd August 2005 10:16 GMT

When security researcher and ISS employee Michael Lynn went to give a presentation at the Black Hat conference in Las Vegas, little did he know he would ignite a legal firestorm questioning whether even the act of looking for security vulnerabilities violates the law.

A brief history

Lynn, in his position with ISS, apparently disassembled the compiled code running on a particular type of Cisco router, discovered particular types of potential vulnerabilities, and prepared a PowerPoint presentation not only describing the vulnerabilities (and the potential for exploit) but also containing a small amount of the decompiled code to demonstrate how the vulnerability and potential exploit would work. Pretty standard stuff. What wasn't standard was the reaction of both Cisco and ISS - they went into federal district court in California (where Cisco is located) and asked for an injunction against both Lynn and Black Hat, not only preventing the presentation but also requiring that all copies of it be deleted from the conference CDs and ripped out of the presentation binders. Lynn resigned from ISS and gave the speech anyway, before agreeing to the injunction going forward. Copies of the presentation - Cisco code and all - are predictably available at mirror sites on the net.

Is it legal to decompile software?

The question for security researchers going forward is framed by the Lynn saga. Is it legal to decompile a vendor's compiled code to find vulnerabilities? Of course, the answer is mixed. Maybe it is, maybe it's not.
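
(A concrete aside for non-programmers: "decompiling" and "disassembling" simply mean translating a program's compiled, machine-readable instructions back into a human-readable form. The sketch below is only an analogy - Lynn worked on Cisco's compiled IOS code, not Python - but Python's standard-library dis module shows what the process looks like.)

    import dis

    def check_password(supplied, expected):
        # A toy function; its compiled bytecode is what we disassemble below.
        return supplied == expected

    # Translate the compiled bytecode back into readable mnemonics
    # (LOAD_FAST, COMPARE_OP, RETURN_VALUE, ...), recovering the logic
    # even when no readable source was ever shipped.
    dis.dis(check_password)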

Much of the debate over the Lynn case has arisen in the context of "responsible disclosure," which asks whether Lynn should simply have told Cisco of the vulnerability, allowed them to fix it, and been done with it. The reports are that he did just that. But according to the terms of the lawsuit, both Cisco's and ISS's legal claims against him arose long before the Black Hat conference. It was the very act of decompiling the code - an act Lynn undertook as an employee in good standing with ISS - that apparently constituted the violation of the law, although what they sought to enjoin was the disclosure of the decompiled code.

The lawsuit alleges a somewhat convoluted legal theory of liability, and the facts spelled out are particularly murky. Essentially, Cisco's and ISS's claim is as follows: Lynn worked for ISS under a standard "non-disclosure" agreement. His employer ISS presumably bought a Cisco router, which came with software subject to an End User License Agreement (EULA). The EULA stated that the licensee (ISS) specifically agreed not to "reverse engineer or decompile, decrypt, disassemble or otherwise reduce the Software to human-readable form, except to the extent expressly permitted under applicable law," or to "disclose... trade secrets contained in the Software or documentation in any form to any third party..."

Cisco's theory, then, was that by decompiling the software to find the vulnerability, Lynn (and presumably his employer, ISS) violated the terms of the EULA - a contract. This contract violation meant that the license to acquire or use the software was breached, so Lynn was using a copyrighted work (the software) without the consent of the copyright holder - thus a copyright violation - which gets Cisco into federal court rather than state court. When Lynn and Black Hat sought to publish the bits of decompiled code in the presentation, they were alleged to be distributing the code in violation of the EULA and copyright law, and also violating Cisco's right to protect its trade secrets. Finally, Lynn was alleged to have violated the terms of his ISS non-disclosure agreement by disclosing information at the conference that he learned "in secret" from ISS under the NDA - presumably information that ISS obtained by unlawful reverse engineering!

Michael Lynn settled the case with Cisco and ISS by essentially agreeing not to further distribute the disassembled code, and to destroy any retained copies of it. Therefore, this case has little if any actual precedential value. However, Lynn also agreed to be enjoined "from unlawfully disassembling or reverse engineering Cisco code in the future."

Does this mean that Lynn admits that his disassembling of the code was unlawful, or that, in the future, he will be free to continue to disassemble Cisco code, as long as he does not do so unlawfully? This raises the ultimate question: is reverse engineering for security purposes unlawful where an EULA restricts reverse engineering?

Some case history on code disassembly

In two cases, Atari Games Corp. v. Nintendo of America, Inc. and Sega Enterprises v. Accolade, Inc., courts addressed whether video game producers could legally reverse engineer the software contained in game consoles to decipher the unprotected security codes necessary for game cartridges to operate on those consoles. Both courts essentially held that reverse engineering itself (provided the software was obtained lawfully) was a permissible "fair use" of the code under copyright law. But that is just copyright law.
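
To see why a game publisher would need to reverse engineer at all, consider a simplified, hypothetical version of that kind of lock-out scheme (the offset and byte sequence below are invented for illustration): the console refuses to boot any cartridge that lacks a particular "security code" at a known location, and a competitor can only learn what to put there by disassembling the console's boot code.

    # Hypothetical, simplified console lock-out check. The offset and the
    # required byte sequence are invented for illustration; real consoles
    # differed in detail.
    SECURITY_OFFSET = 0x100   # assumed location of the code in the ROM header
    SECURITY_CODE = b"SEGA"   # assumed required byte sequence

    def console_will_boot(rom: bytes) -> bool:
        # The console runs the cartridge only if the expected bytes appear
        # at the expected offset - a purely functional gate, not expression.
        found = rom[SECURITY_OFFSET:SECURITY_OFFSET + len(SECURITY_CODE)]
        return found == SECURITY_CODE

Because the required bytes are purely functional - any compatible cartridge must contain them - the courts treated them as unprotected, and the intermediate copying done during reverse engineering as fair use.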

In 2003, a federal appeals court in Bowers v. Baystate Technologies, Inc. decided as a matter of contract law that the "no reverse engineering" provisions in a "shrinkwrap" EULA were enforceable. In other words, while you have a right to reverse engineer software, you can give up that right by entering into a contract with the software manufacturer (the copyright holder) - and you enter into that contract merely by obtaining the software. Some right. What is worse is that, even though the decompiling itself is fair use and therefore not a copyright violation, if the EULA says you can't do it and you do it anyway, you violate your license agreement - and using the software outside the license makes it a copyright violation after all. To add another twist, the Cisco EULA "contract" says that you agree not to decompile unless you are entitled to do so under applicable law.

Now, there are lots of reasons why you might want to decompile or reverse engineer software; some are perfectly valid and some are less so. Compatibility and integrity are perfectly valid reasons to decompile parts or all of the code, and the Cisco EULA expressly provides a process for obtaining certain code necessary for compatibility - with the consent of Cisco. Decompiling code to commit "piracy," or for some other infringing purpose, is not something the law should encourage. And certainly there are strong public policy reasons to allow responsible security researchers to pick apart the code to find vulnerabilities.

Hackers and true "black hats" will not be deterred by the terms of the EULA, however - let's face it, they probably never bought the code in the first place and never entered into the clickwrap EULA. Thus, the "bad guys" will pick apart the code to discover and exploit the vulnerabilities anyway. The implication of the Lynn case is that you must first ask permission of the software vendor to pick apart the code. I can imagine vendors granting permission with restrictions, whereby you can reverse engineer the code to find vulnerabilities, provided that you agree to tell US about them - and never, ever, tell anyone else. In other words, they get the opportunity to fix the code, and nobody is the wiser.

All contracts, including those in an EULA, are subject to being declared void if they are "against public policy." There is a strong public policy interest in encouraging responsible discovery and disclosure of security vulnerabilities (though I express no opinion here about whether Lynn's disclosure was responsible or not). Where the terms of an EULA prevent such responsible disclosure, and therefore make the world less secure (especially where the reverse engineering is itself permitted under copyright law), the terms might be void as against public policy. This is particularly true where the vulnerability itself would significantly impact public health, safety and welfare, and even more so if the vendor simply refused to take action. Imagine a contract obligating you not to look for security problems in a nuclear power plant, or not to disclose any that you found. Courts would have a hard time enforcing such a provision. The problem here is that not all discoveries and disclosures are "responsible."

A guide to disassembly, legally

Here is a simple proposal. It should be okay to reverse engineer for the purpose of looking for security vulnerabilities, provided that you disclose such vulnerabilities consistent with agreed rules for such disclosure. This would place an obligation on the software vendor to act promptly, to fix and disclose the vulnerability, and to give credit for the discovery where it is due. This "right" would be implied in every EULA. If the vendor fails to act responsibly with respect to the disclosed vulnerability, the discoverer gets to disclose not only the details of the vulnerability but also the bits of code that are vulnerable, after a reasonable time has passed. This would give vendors a significant incentive to act promptly. Finally, any reverse engineering for security purposes would have no effect on the "trade secret character" of the underlying copyrighted software, since it would be done pursuant to what is essentially an NDA. Of course, for this to work, everybody would have to agree on what "responsible disclosure" is. And if we could all agree on something like this, what would we need lawyers for?
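
As a rough sketch of how the proposal's timeline would operate in practice (the 90-day "reasonable time" below is an assumed figure; the proposal deliberately leaves it to be agreed):

    from datetime import date, timedelta

    # An assumed "reasonable time" window; the proposal itself does not fix one.
    REASONABLE_TIME = timedelta(days=90)

    def may_disclose_code(reported_on: date, vendor_acted: bool, today: date) -> bool:
        # Under the proposal, the researcher may publish the vulnerable bits
        # of code only if the vendor failed to act responsibly and a
        # reasonable time has passed since the report.
        return (not vendor_acted) and today >= reported_on + REASONABLE_TIME

    # Example: reported 1 May 2005, vendor silent - full disclosure is
    # permitted from 30 July 2005 onward.
    print(may_disclose_code(date(2005, 5, 1), False, date(2005, 8, 1)))  # True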

Copyright © 2005, SecurityFocus

Mark D. Rasch, J.D., is a former head of the Justice Department's computer crime unit, and now serves as Senior Vice President and Chief Security Counsel at Solutionary Inc.