@tylerni7
Last active April 26, 2016 19:40

BIS-2015-0011-0001 (the proposed implementation of the Wassenaar Arrangement) is a terrible idea. The only effect this rule will have on cyber security is to harm legitimate researchers, and thereby make illegal activities even easier for cyber criminals.

First off, what problem is Wassenaar trying to address? Computer security has been a growing problem for the past decade, and its importance seems to be growing at an exponential rate. There are reports in the media every week of large-scale intrusions at companies and government organizations. Presumably the goal of Wassenaar is to stop, or at least slow down, these sorts of cyber attacks. However, it is not at all clear how Wassenaar will accomplish this. Will Wassenaar affect the nation-state actors who are responsible for many of the breaches in the media? Clearly not; rogue nation-states are not going to be punished under our laws. In that case, it must be meant to help prosecute criminals who use computers to attack corporations or people. However, what they are doing is already illegal, and they clearly don't care. So it would seem this rule will not help on that front.

Supporters of Wassenaar might view things differently. Perhaps their view is that limiting "tools of the trade" will hinder cyber criminals. Although obviously not perfectly successful, ideas such as this have been applied to physical weapons in the past, so why not apply them to "cyber weapons"? The problem is that physical materials and weapons are much less versatile and far easier to keep track of. It's conceivable that a shipment of AK-47s could be caught at a border by customs or other agents; unfortunately, there is no analogous way to keep track of the digital tools that computer criminals may use. Further, physical munitions are easier to classify than digital tools. You can't do much with high-capacity magazines besides put them in guns, but so-called "intrusion software" can be used by forensic investigators, malware researchers, penetration testers, and computer security educators for entirely legitimate purposes. So what would the effects of limiting "intrusion software" be? Criminals will easily exfiltrate tools undetected over the internet (or in publications, or on hand-carried memory cards, etc.), as there is no possible way to stop them. Meanwhile, legitimate researchers who wish to obey the letter of the law will be shut down and unable to protect the internet.
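
To make the dual-use point concrete, below is a minimal, hypothetical sketch (Linux-specific; the function name is my own, not from any real tool) of the kind of primitive at issue: reading a process's memory. Pointed at your own process it is a debugging aid; pointed at a suspect machine it is memory forensics; pointed at a victim it is "intrusion software". The code is identical in every case.

```python
# A hypothetical, Linux-specific sketch (names are mine) of the primitive at the
# heart of the dual-use problem: reading a process's memory through /proc.
# Here it only inspects the current process, which needs no special privileges.

def read_first_mapping(pid: str = "self", limit: int = 4096) -> bytes:
    """Return the first bytes of the first readable memory mapping of a process."""
    with open(f"/proc/{pid}/maps") as maps:
        for line in maps:
            addr_range, perms = line.split()[:2]
            if perms.startswith("r"):                      # first readable region
                start, end = (int(x, 16) for x in addr_range.split("-"))
                with open(f"/proc/{pid}/mem", "rb", buffering=0) as mem:
                    mem.seek(start)
                    return mem.read(min(end - start, limit))
    raise RuntimeError("no readable mapping found")

if __name__ == "__main__":
    data = read_first_mapping()   # pointed at another PID, this is forensics or, equally, intrusion
    print(f"read {len(data)} bytes of process memory")
```

Essentially the same primitive appears in commercial forensic suites, anti-virus memory scanners, and credential-stealing malware alike; a definition based on the code's capabilities alone cannot tell them apart.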

We can ignore the impossibility of actually controlling the export of "intrusion technologies" and examine what would happen to malicious actors if we were able to limit the proliferation of these technologies. Setting aside the fact that legitimate researchers would be hindered from doing their jobs to stop criminals, what impact would criminals feel? It could be the case that certain public tools would become harder for criminals to access, which could hamper their attacks. However, these tools are discussed in thousands of published papers, books, patents, and articles on the internet and in print. So even if certain pieces of software were no longer available online or for purchase, nothing prevents criminal organizations or nation-states from producing more. And again, since no physical supplies are required beyond computers, there is no way to stop the creation of new tools by malicious actors.

Policymakers obviously don't intend for Wassenaar to disrupt only the activities of legitimate researchers, but that is exactly what these rules will do. As countries have accepted "intrusion software" into their definitions of dual-use technologies, security researchers have already begun to feel the impact. Two examples: [1], a tool published online for malware and forensics analysts, was removed after German law made such tools illegal; and [2], a yearly contest in which companies such as Google and Microsoft pay researchers for "intrusion software" targeting their own products so that they can improve their security, has recently had trouble with researchers participating from countries where their activities may be considered illegal. As a computer security researcher, it's frightening to think about how much more difficult my job will become as other researchers stop releasing tools or information for fear of breaking the law.

Despite what anyone says, making "intrusion software" illegal or requiring an arms license is clearly already harming researchers. Currently, America is one of a dwindling number of places where computer security innovation prospers and remains free. It is foolish to think that changing this will do anything but harm our nation's security.

Although the Wassenaar Arrangement does allow researchers to obtain licenses in order to continue their work, this is not an acceptable compromise. Independent researchers, those who cannot afford licenses, and those whose applications are rejected will be shut down; and in a community with an already high sense of paranoia, filling out an application declaring intent to research "cyber munitions" would be an impossibly high barrier to entry. Despite the provision for licenses, the result will be the same: a great deal of legitimate and beneficial security research will halt.

One reaction to all this would be to go back and try to improve the definition of "intrusion software". However, this is a futile approach. There is no way to specifically target software used for evil versus software used for good. The entire anti-virus industry is built on attempting to solve a problem even easier than this one, and it still has not found an acceptable solution. Instead, we should take a step back and work to create laws that address the real issue: illegal cyber activities. Most states don't regulate lock-picks, but they do regulate burglary and breaking and entering; so too should it be for computer security. Leave the tools available to those who wish to use them for good, and punish criminals who use them for illegal purposes.

As it stands, including "intrusion software" in Wassenaar actually fails three of the four criteria that Wassenaar itself uses to add new items to the list of dual-use technologies [3]: these technologies are available globally to anyone with a computer, controlling the export of computer software is impossible, and creating a clear and objective specification of software used for malicious purposes is a problem computer scientists know to be provably impossible in general [4].
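
To make the last criterion concrete, here is a minimal sketch of the standard argument, my own illustration rather than anything in the proposal or in [4]; all of the function names are hypothetical. If a perfect, purely code-based classifier for "software used for malicious purposes" existed, it could be turned into a solver for the halting problem, which cannot exist.

```python
# A hypothetical sketch of the standard reduction (all names are illustrative):
# assume a perfect classifier for "software used for malicious purposes" exists,
# and use it to build a decider for the halting problem, which Turing proved
# cannot exist [4]. The contradiction means no such classifier can exist.

def is_malicious(program_source: str) -> bool:
    """Hypothetical perfect classifier, assumed (for contradiction) to decide
    whether running the given program ever does anything malicious."""
    raise NotImplementedError("no such classifier can exist")

def halts(program_source: str, program_input: str) -> bool:
    """If is_malicious existed, this would decide the halting problem."""
    # Wrap the target program: run it to completion, then perform an act that is
    # unambiguously malicious. The wrapper is malicious if and only if the target
    # program halts on the given input, so classifying the wrapper decides halting.
    wrapper = f"""
def wrapped():
    run_program({program_source!r}, {program_input!r})  # may never return
    delete_all_files()                                   # the unambiguously malicious act
"""
    return is_malicious(wrapper)
```

This is essentially the problem the anti-virus industry has struggled with for decades, and it is why any definition written into a regulation will either miss malicious tools or sweep in legitimate ones.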

Although I sympathize with policymakers' desire to curb the computer security problems facing the world, it is painfully obvious that these rules will do more harm than good. They harm researchers whose goal is to secure software, as well as researchers who work to track down computer criminals; meanwhile, actual criminals are completely untouched and can continue to operate unmolested.

If you value this nation's security, I urge you to do everything you can to prevent the proposed rule BIS-2015-0011-0001 from being adopted.

(For more technical descriptions of how "intrusion software" can be used to secure systems, [5] is an excellent article and provides a handful of concrete examples.)

[1] http://www.trapkit.de/research/sslkeyfinder/

[2] http://www.net-security.org/secworld.php?id=17961

[3] http://www.wassenaar.org/controllists/2005/Criteria_as_updated_at_the_December_2005_PLM.pdf

[4] http://en.wikipedia.org/wiki/Halting_problem

[5] https://www.usenix.org/system/files/login/articles/wassenaar.pdf
