Last updated at Thu, 09 Jul 2020 20:23:53 GMT
A major area of focus in the current cybersecurity policy discussion is how growing adoption of encryption impacts law enforcement and national security, and whether new policies should be developed in response. This post briefly evaluates several potential outcomes of the debate, and provides Rapid7's current position on each.
Rapid7 has great respect for the work of our law enforcement and intelligence agencies. As a cybersecurity company that constantly strives to protect our clients from cybercrime and industrial espionage, we appreciate law enforcement's role in deterring and prosecuting wrongdoers. We also recognize the critical need for effective technical tools to counter the serious and growing threats to our networks and personal devices. Encryption is one such tool.
Encryption is a fundamental means of protecting data from unauthorized access or use. Commerce, government, and individual internet users depend on strong security for our communications. For example, encryption helps prevent unauthorized parties from reading sensitive communications – like banking or health information – traveling over the internet. Another example: encryption underpins certificates that demonstrate authenticity (am I who I say I am?), so that we can have high confidence that a digital communication – such as a computer software security update – is coming from the right source and not a man-in-the-middle attacker. The growing adoption of encryption for features like these has made users much safer than they would be without it. Rapid7 believes companies and technology innovators should be able to use the encryption protocols that best protect their customers and fit their service model – whether that protocol is end-to-end encryption or some other system.
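The authenticity idea above can be sketched in a few lines. Real update systems use public-key signatures (the vendor signs with a private key; clients verify with the vendor's public key), but a keyed hash shows the same core property: a recipient can detect any tampering with the payload in transit. The key and payload below are hypothetical, for illustration only.

```python
import hashlib
import hmac

# Illustrative sketch only: production software updates rely on public-key
# signatures rather than a shared secret, but the verification property is
# the same -- a modified payload fails the check.
SECRET = b"shared-secret-key"  # hypothetical key, for illustration

def sign(payload: bytes) -> str:
    """Compute an authentication tag over the payload."""
    return hmac.new(SECRET, payload, hashlib.sha256).hexdigest()

def verify(payload: bytes, tag: str) -> bool:
    """Check the payload against the tag in constant time."""
    return hmac.compare_digest(sign(payload), tag)

update = b"security-patch-v1.2"   # hypothetical update payload
tag = sign(update)

print(verify(update, tag))             # genuine update verifies
print(verify(b"tampered-bytes", tag))  # a man-in-the-middle edit fails
```

A man-in-the-middle who alters the update cannot produce a valid tag without the key, which is why weakening this machinery weakens every user who depends on it.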
However, we also recognize this increased data security creates a security trade-off. Law enforcement will at times encounter encryption that it cannot break by brute force and for which only the user – not the software vendor – has the key, and this will hinder lawful searches. The FBI's recently concluded effort to access the cell phone belonging to deceased terrorist Syed Farook of San Bernardino, California, was a case study in this very issue. Although the prevalence of systems currently secured with end-to-end encryption with no other means of access should not be overstated, law enforcement search attempts may be thwarted more often as communications evolve to use unbreakable encryption with greater frequency. This prospect has tempted government agencies to seek novel ways around encryption. While we do not find fault with law enforcement agencies attempting to execute valid search or surveillance orders, several of the options under debate for circumventing encryption pose broad negative implications for cybersecurity.
One option under discussion is a legal requirement that companies weaken encryption by creating a means of "exceptional access" to software and communications services that government agencies can use to unlock encrypted data. This option could take two forms – one in which the government agencies hold the decryption keys (unmediated access), and one in which the software creator or another third party holds the decryption keys (mediated access). Both models would impose significant security risks for the underlying software or service by creating attack surfaces for bad actors, including cybercriminals and unfriendly international governments. For this reason, Rapid7 does not support a legal requirement for companies or developers to undermine encryption to facilitate government access to encrypted data.
The huge diversity of modern communications platforms and software architecture makes it impossible to implement a one-size-fits-all backdoor into encryption. Instead, to comply with a hypothetical mandate to weaken encryption, different companies are likely to build different types of exceptional access. Some encryption backdoors will be inherently more or less secure than others due to technical considerations, the availability of company resources to defend the backdoor against insider and external threats, the attractiveness of client data to bad actors, and other factors. The resulting environment would most likely be highly complex, vulnerable to misuse, and burdensome to businesses and innovators.
Rapid7 also shares concerns that requiring US companies to provide exceptional access to encrypted communications for US government agencies would lead to sustained pressure from many jurisdictions – both local and worldwide – for similar access. Companies or oversight bodies may face significant challenges in accurately tracking when, by whom, and under what circumstances client data is accessed – especially if governments have unmediated access to decryption keys. If US products are designed to be inherently insecure and "surveillance-ready," then US companies will face a considerable competitive disadvantage in international markets where more secure products are available.
Legal mandates to weaken encryption are unlikely to keep unbreakable encryption out of the hands of well-resourced criminals and terrorists. Open source software is commonly "forked," and it should be expected that developers will modify open source software to remove an encryption backdoor. Jurisdictions without an exceptional access requirement could still distribute closed source software with unbreakable encryption. As a result, the cybersecurity risks of weakened encryption are especially likely to fall on users who are not already security-conscious enough to seek out these workarounds.
Intentionally weakening encryption or other technical protections ultimately undermines the security of end users, businesses, and governments. That said, if companies or software creators voluntarily choose to build exceptional access mechanisms into their encryption, Rapid7 believes it is their right to do so. However, we would not recommend doing so, and we believe companies and creators should be as transparent as possible with their users about any such feature.
"Technical assistance" – compelled malware
Another option under debate is whether the government can force developers to build custom software that removes security features of the developers' products. This prospect arose in connection with the FBI's now-concluded bid to unlock Farook's encrypted iPhone to retrieve evidence for its terrorism investigation. In that case, a magistrate judge ordered Apple to develop and sign a custom build of iOS that would disable several security features preventing the FBI from using electronic means to quickly crack the phone's passcode via brute force. This custom version of iOS would have been deployed like a firmware update only to the deceased terrorist's iPhone, and Apple would have maintained control of both the iPhone and the custom iOS. However, the FBI ultimately cracked the iPhone without Apple's assistance – with help, according to some reports, from a third party company – and asked the court to vacate its order against Apple. Still, it's possible that law enforcement agencies could again attempt to legally compel companies to hack their own products in the future.
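To see why those security features matter, consider the scale of the problem they guard against. A four-digit numeric passcode has only 10,000 possibilities; absent rate limiting, escalating delays, or an attempt cap, exhausting them electronically is trivial. The sketch below is purely illustrative and does not model any vendor's actual mechanism.

```python
import itertools

# Illustrative sketch: exhaustive search of a 4-digit numeric passcode.
# Without per-attempt delays or a failed-attempt cap -- the kinds of
# protections the court order sought to disable -- every candidate can
# be tried almost instantly.
def crack(target: str) -> tuple[str, int]:
    """Try every 4-digit code in order; return the code and attempt count."""
    attempts = 0
    for combo in itertools.product("0123456789", repeat=4):
        attempts += 1
        guess = "".join(combo)
        if guess == target:
            return guess, attempts
    raise ValueError("passcode not found")

guess, attempts = crack("7392")  # hypothetical passcode
print(guess, attempts)  # found within at most 10,000 attempts
```

Device protections work by making each of those 10,000 guesses slow or risky (for example, by wiping data after repeated failures), which is why an order to remove them effectively removes the passcode's security.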
In the Farook case, the government had good reason to examine the contents of the iPhone, and clearly took steps to help prevent the custom software from escaping into the wild. This was not a backdoor or exceptional access to encryption as traditionally conceived, and not entirely dissimilar to cooperation Apple has offered law enforcement in the past for unencrypted older versions of iOS. Nonetheless, the legal precedent that would be set if a court compels a company or developer to create malware to weaken its own software could have broad implications that are harmful to cybersecurity.
FBI Director James Comey confirmed in testimony before Congress that if the government succeeded in court against Apple, law enforcement agencies would likely use the precedent as justification to demand companies create custom software in the future. It's possible the precedent could be applied to a prolonged wiretap of users of an encrypted messaging service like WhatsApp, or a range of other circumstances. Establishing the limits of this authority would be quite important.
If the government consistently compelled companies to create custom software to undermine the security of their own products, the effect could be proliferation of company-created malware. Companies would need to defend their malware from misuse by both insiders and external threats while potentially deploying the malware to comply with many government demands worldwide, which – like defending an encryption backdoor – would be considerably burdensome on companies. This outcome could reduce user trust in the security of vendor-issued software updates, even though it is generally critical for cybersecurity for users to keep their software as up to date as possible. Companies may also design their products to be less secure from the outset, in anticipation of future legal orders to circumvent their own security.
These scenarios raise difficult questions for cybersecurity researchers and firms like Rapid7. Government search and surveillance demands are frequently paired with gag orders that forbid the recipient (such as the individual user or a third party service provider) from discussing the demands. Could this practice impact public disclosure or company acknowledgment of a vulnerability when researchers discover a security flaw or threat signature originating from software a company is compelled to create for law enforcement? When would a company be free to fix its government-ordered vulnerability? Would cybersecurity firms be able to wholeheartedly recommend clients accept vendor software updates?
Rapid7 does not support legal requirements – whether via legislation or court order – compelling companies to create custom software to degrade security. Creating secure software is very difficult under the best of circumstances, and forcing companies to actively undermine their own security features would undo decades of accumulated security lessons and practice. If the government were to compel companies to provide access to their products, Rapid7 believes it would be preferable to use tools already available to those companies (such as those Apple offered prior to iOS 8) in limited circumstances that do not put non-targeted users at risk. If a company has no existing means of accessing its products, the government should not compel it to create custom software to undermine its products' security features. Software developers should also be free to develop patches or introduce more secure versions of their products to fix vulnerabilities at any time.
Government hacking and forensics
Finally, there is the option of government deploying its own tools to hack products and services to obtain information. End-to-end encryption provides limited protection when one of the endpoints is compromised. If government agencies do not compel companies to weaken their own products, they could exploit existing vulnerabilities themselves. As noted above, the government's exploitation of existing vulnerabilities was the outcome of the FBI's effort to compel Apple to provide access to Farook's iPhone. Government has also turned to hacking or implanting malware in other contexts well before the Farook case.
In many ways, this activity is to be expected. It is not an irrational priority for law enforcement agencies to modernize their computer penetration capabilities to be commensurate with savvy adversaries. A higher level of hacking and digital forensic expertise for law enforcement agencies should improve their ability to combat cybercriminals more generally. However, this approach raises its own set of important questions related to transparency and due process.
Upgrading the technological expertise of law enforcement agencies will take time, education, and resources. It will also require thoughtful policy discussions on what the appropriate rules for government hacking should be – there are few clear and publicly available standards for government use of malware. One potentially negative outcome would be government stockpiling of zero-day vulnerabilities for use in investigations, without disclosing the vulnerabilities to vendors or the public. The picture is clouded further when the government partners with third party organizations to hack on the government's behalf, as may have occurred in the case of Farook's iPhone – if the third party owns a software exploit, could IP or licensing agreements prevent the government from disclosing the vulnerability to the vendor? White House Cybersecurity Coordinator Michael Daniel noted there were "few hard and fast rules" for disclosing vulnerabilities, but pointed out that zero-day stockpiles put Internet users at risk and would not be in the interests of national security. We agree and appreciate the default of vulnerability disclosure, but clearer rules on transparency and due process in the context of government hacking are becoming increasingly important.
No easy answers
We view the complex issue of encryption and law enforcement access as security versus security. To us, the best path forward is that which would provide the best security for the greatest number of individuals. To that end, Rapid7 believes that we should embrace the use of strong encryption without compelling companies to create software that undermines their product security features. We want the government to help prevent crime by working with the private sector to make communications services, commercial products, and critical infrastructure trustworthy and resilient. The foundation of greater cybersecurity will benefit us all in the future.
Director of Public Policy, Rapid7