Why quantum and data protection should go hand in hand

If the May 7 ransomware attack on the Colonial Pipeline was a wake-up call, you have been taking a surprisingly good sleeping pill!

Critical infrastructure has long been a popular target for adversaries and sophisticated criminal organizations. The hacking gang and ransomware extortionists known as DarkSide infiltrated Colonial’s systems through a compromised password on an old, disused VPN account, leveraging it not only to deploy ransomware but also to steal more than 100 GB of data. 

How cybercriminals can use that information in the future depends on the data itself, but it is clear that these vast tranches of data can have a life far beyond their immediate resale on some sketchy Dark Web forum. Stolen data can contain sensitive corporate intellectual property, personally identifiable information and protected health information such as fingerprints and DNA sequencing. More than 37 billion data records were stolen in 2020, up 141% from the year before even as the volume of publicly disclosed data breaches fell by 48%, according to Risk Based Security. Much of this encrypted data is stockpiled, waiting for the day when a quantum computer is available to easily break its encryption — an attack vector known as harvesting.

The devastating series of attacks on federal agencies and partners – SolarWinds, Microsoft Exchange Server and now the Colonial Pipeline – has prompted the government to take action. On May 12, 2021, the Biden administration announced an Executive Order on improving the nation’s cybersecurity and strengthening federal network defenses.

The EO pushes the federal government and private industry to work together to modernize cybersecurity practices, secure software development, strengthen incident response, improve threat detection and information sharing and accelerate investigation and remediation. 

Before agencies rush to check the EO boxes, they should first perform a cybersecurity risk assessment to identify gaps, define a risk profile and align mitigation priorities and strategies with their overall risk appetite. A sound enterprise risk management profile should consider not only modern-day vulnerabilities and present risk scenarios but also those in the foreseeable future.

The era of quantum computing is fast approaching, and with it the greatest threat to cryptography at a scale never seen before. A quantum computer will be able to break RSA-2048, considered the gold standard for Public Key Encryption (PKE), the system that has for years protected our digital universe. While the National Institute of Standards and Technology has been working to down-select post-quantum cryptographic algorithm candidates, the standards body warns another five to 15 years may be needed before the full transition is completed. Quantum risk assessment – the steps needed to identify and safeguard infrastructure and critical data in the post-quantum world – should be part of an agency’s overall cybersecurity risk management program, current data security practices and EO compliance efforts.

Because government data security requirements have a much longer shelf-life than those of other industries — up to 50 years in the case of official intelligence — it is reasonable to assume this data may be vulnerable to quantum attack if not protected properly today. Any country that attains a quantum computing system of sufficient power will be able to decrypt stored data with ease, and an adversary is unlikely to divulge this intelligence advantage. These facts, and the continued use of outdated security models and unencrypted data by critical infrastructure operators, require organizations to make bold, future-proof changes to their cybersecurity practices.
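The urgency implied by these timelines is often framed as Mosca's inequality: if the number of years data must stay secret, plus the years needed to migrate to quantum-safe cryptography, exceeds the years until a cryptographically relevant quantum computer exists, harvested data is already at risk. A minimal sketch — the specific numbers below are illustrative assumptions drawn from the figures in this article, not forecasts:

```python
def quantum_exposed(shelf_life_years: float,
                    migration_years: float,
                    years_to_quantum: float) -> bool:
    """Mosca's inequality: data is exposed if x + y > z, where x is the
    required secrecy lifetime, y the migration time to quantum-safe
    cryptography, and z the time until a capable quantum computer."""
    return shelf_life_years + migration_years > years_to_quantum

# Illustrative: intelligence data with a 50-year shelf-life, a 10-year
# migration (mid-range of NIST's 5-15 year estimate), and a quantum
# computer hypothetically 15 years away.
print(quantum_exposed(50, 10, 15))  # → True: the data outlives the transition
```

Under almost any plausible choice of numbers, long-duration government data fails the inequality, which is why the assessment has to start now rather than when a quantum computer is announced.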

Federal agencies and their partners looking to update their encryption infrastructure to comply with the EO and better protect long-duration data should conduct a quantum risk assessment and deploy quantum-safe solutions for protecting their data today and in the quantum-enabled future. Failing to do so could cause the premature obsolescence of any upgraded system or architecture — a costly and risk-intensive scenario best avoided.

To expedite EO mandates and initiate cybersecurity and quantum risk assessment, agencies should begin with the following two-step process:

  • Step 1. Perform a cybersecurity risk analysis using the existing baseline derived from a qualitative approach, such as NIST’s Cybersecurity Framework, then build to a quantitative approach (e.g., Sim Segal’s value-based enterprise risk management process) to measure the impact on the organization. Using these findings, agency leaders can prioritize cybersecurity next steps and budget allocation.
  • Step 2. Determine the value of data, focusing on persistent data (its time to value), to incorporate zero-trust and layered security architectures and solutions. Then secure existing network communications links by preventing the theft of encryption keys. Next-generation key distribution systems are turning PKE on its head by leveraging out-of-band key delivery, which separates the encryption key from the data path and sends it over a separate quantum-safe network. This architectural approach to encrypting data in motion, in which two keys are in play, does not impact network performance or reliability, meets many of the mandates outlined in Section 3 of the EO and extends the life of existing encryption infrastructure into the quantum era with minimal lift or outlay.
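The out-of-band pattern in Step 2 can be illustrated with a toy sketch: the data path carries only ciphertext, while the key travels over a logically separate channel (represented here by a second return value; in a real deployment, a quantum-safe key-delivery network). The one-time-pad XOR below is purely illustrative of the separation, not a production cipher, and all names are hypothetical:

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """Encrypt with a one-time pad and return (ciphertext, key) separately.

    The caller sends the ciphertext over the data path and the key over
    a separate, out-of-band channel; the two never travel together, so
    intercepting the data path alone yields nothing decryptable.
    """
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def otp_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    """Recombine the two streams at the receiving end."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))

# Ciphertext travels the data path; the key arrives via the key-delivery network.
ct, key = otp_encrypt(b"pipeline telemetry")
assert otp_decrypt(ct, key) == b"pipeline telemetry"
```

The design point is that a harvesting adversary who records the data path gains no stockpile worth decrypting later, because the keys were never on that path to begin with.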

The White House Executive Order on Improving the Nation’s Cybersecurity is a good step forward in advancing efforts by the nation and industry to identify, deter and protect against malicious actions and actors. We must now each do our part to ensure this matter of national security and trust is a success by implementing controls to manage and mitigate not only the cyberattacks of today but also the potential threats of tomorrow.
