## Post Quantum Cryptography: Moore's Law on Steroids

Recently, I was a member of a panel discussion where an audience member asked for our thoughts on how quantum computing would affect data security and confidentiality (encryption). This was an interesting question, since current encryption methods would likely not withstand the massive computing power that quantum computing has to offer. To better understand my answer (and Microsoft's answer) to this question, a brief explanation of quantum computing helps.

In 1965, Gordon Moore, who co-founded Fairchild Semiconductor and later Intel, published a paper theorizing that the density of transistors on integrated circuits would double roughly every two years. Simply put, computers would get twice as fast every other year. In the decades since, this prediction has mostly held true, capping a history of computing breakthroughs that runs from the Analytical Engine and ENIAC through the Cray supercomputers to the most powerful GPUs in existence today powering AI, high-end gaming, and cryptocurrency mining. Quantum computing, however, advances too quickly for Moore's Law to model it accurately; Rose's Law, named for D-Wave founder Geordie Rose, was proposed to describe the growth in quantum computing power instead.

Traditional computing uses transistors to store 1's and 0's representing the state of data. It is binary, with nothing in between; if a bit is not a 1, it is definitely a 0. Quantum computing instead uses quantum bits (qubits), which can exist in superpositions of 0 and 1 simultaneously, allowing a computer to explore a problem from many perspectives at once rather than one at a time. This potential computing power has called into question whether current encryption methods will be sufficient in a post-quantum world.
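As a rough illustration of the difference, a classical n-bit register holds exactly one of its 2^n possible values at a time, while an n-qubit register is described by 2^n amplitudes at once. The sketch below (plain Python, no quantum hardware involved) models an equal superposition over three qubits:

```python
import itertools

# A classical 3-bit register holds exactly one of its 2**3 = 8 states.
classical_state = (1, 0, 1)

# An n-qubit register is described by 2**n complex amplitudes, one per basis
# state; measuring it yields each state with probability |amplitude|**2.
# Here: an equal superposition over all 8 three-bit basis states.
n = 3
amp = (1 / 2**n) ** 0.5  # 1/sqrt(8) for each basis state
state_vector = {bits: amp for bits in itertools.product((0, 1), repeat=n)}

# The 8 probabilities sum to 1: one quantum state spans every classical state.
total = sum(abs(a) ** 2 for a in state_vector.values())
print(len(state_vector), round(total, 9))
```

This is only bookkeeping, of course; the power of a real quantum computer comes from manipulating all of those amplitudes with a single operation.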

RSA asymmetric cryptography (commonly known as public/private key encryption) relies on the fact that large numbers are very difficult to factor. In RSA, the public key contains the product of two secret prime factors. The reason a public key can be freely made public is that, even knowing the product, there is no currently feasible classical method to break it back down into the two primes that were multiplied to obtain it. Multiplying two very large prime numbers is trivial for most classical processors, but factoring the result should not be efficient; it should require more time than is feasible, forcing an attacker to obtain the information some other way.
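The asymmetry is visible even at toy scale. In this sketch the primes are tiny illustrative values (real RSA primes are hundreds of digits long): the multiplication is a single operation, while recovering the factors by trial division takes hundreds of thousands of steps, and that gap widens explosively as the numbers grow.

```python
# Toy RSA asymmetry: multiplying the secret primes is one operation, but
# recovering them from the public product requires a search.
p, q = 999_983, 1_000_003   # small illustrative primes, not real RSA sizes
modulus = p * q             # plays the role of the public key's modulus

def trial_division(n):
    """Brute-force factoring: try 2, then every odd candidate up to sqrt(n)."""
    if n % 2 == 0:
        return 2
    f = 3
    while f * f <= n:
        if n % f == 0:
            return f
        f += 2
    return n  # n itself is prime

factor = trial_division(modulus)
print(factor, modulus // factor)  # recovers 999983 and 1000003, far more slowly
```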

Taking a step back, the basis for all confidentiality (encryption) is to make it extremely efficient for users to protect their data and extremely inefficient for an attacker to decrypt that data without significant effort (the user's cost versus the attacker's cost). Security researchers, using statistical models of RSA prime computation and projected advances in quantum computing, have estimated that RSA asymmetric encryption could be made quantum-resistant using keys on the order of 8 terabits. While this seems absurdly large at the moment, quantum computing is also in its early stages, where practical attacks against RSA are not yet cause for worry. Elliptic Curve Cryptography (ECC) is not immune either; Shor's algorithm would break both RSA and ECC given a sufficiently powerful quantum computer. The effect of quantum computing on encryption is a worthwhile thought exercise, and a development team at Microsoft has built a solution to this problem.
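Back-of-envelope arithmetic makes the cost gap concrete. Assuming a hypothetical attacker who can test a trillion keys per second (the rate is purely an assumption for illustration), the expected time to brute-force a symmetric keyspace grows like this:

```python
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def brute_force_years(key_bits, guesses_per_second=1e12):
    """Expected years to search half the keyspace at the assumed guess rate."""
    return (2 ** key_bits / 2) / guesses_per_second / SECONDS_PER_YEAR

# A 56-bit key (old DES) falls within hours; 128 bits is already astronomical.
for bits in (56, 128, 256):
    print(f"{bits}-bit key: ~{brute_force_years(bits):.2e} years")
```

The defender's cost (one key generation, fast encryption) stays flat while the attacker's cost doubles with every added bit; quantum algorithms threaten encryption precisely because they collapse that exponential gap for certain problems.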

This past week, Microsoft made headlines by acquiring GitHub for $7.5 billion. What did not get the same amount of press coverage was that a Microsoft development team forked the OpenVPN code to develop a Post-Quantum (PQ) version built to withstand the computing advancements anticipated with quantum computing. For the truly paranoid or security-conscious individual, PQCrypto-VPN is freely available today for Windows and Linux and introduces new key exchange algorithms (Frodo and SIKE) and a signature algorithm (Picnic) that were specifically designed to withstand an attack from an actor with quantum computing resources available to them. Unlike public-key cryptography such as RSA and ECC, Picnic is not based on difficult problems from number theory. Instead, it uses zero-knowledge proofs, in which Alice can convince Bob that she knows a secret without disclosing the secret itself.
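Picnic's actual construction is involved, but the zero-knowledge idea itself can be sketched with a classical example. The toy round below is Schnorr-style identification, not Picnic, and it rests on the discrete-log problem, so it is exactly the kind of scheme a quantum computer would break; it is here only to show the proof-of-knowledge pattern. The tiny group parameters are illustrative.

```python
import secrets

# Toy group: g = 2 has prime order q = 11 modulo p = 23 (real schemes use
# group orders of ~256 bits). Alice's secret is x; her public key is y = g^x.
p, q, g = 23, 11, 2
x = 7
y = pow(g, x, p)

k = secrets.randbelow(q)   # Alice picks a random nonce...
t = pow(g, k, p)           # ...and sends the commitment t = g^k
c = secrets.randbelow(q)   # Bob replies with a random challenge
s = (k + c * x) % q        # Alice's response blends the nonce and the secret

# Bob accepts iff g^s == t * y^c (mod p). A passing transcript convinces him
# Alice knew x, yet (t, c, s) alone reveals nothing about x because the
# random nonce k masks it.
assert pow(g, s, p) == (t * pow(y, c, p)) % p
print("proof accepted")
```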

Going back to the answer I gave in the panel discussion, I highlighted the importance of securing the systems we have in place today before worrying about some theoretical attack that will come in the future. While I understand that security always has to be forward looking, there are a number of problems with current cryptographic implementations that can break RSA asymmetric encryption without the aid of quantum computing. For example, a flaw in a cryptographic library from device manufacturer Infineon (dubbed ROCA) produced insecure public keys that allowed attackers to factor a 2048-bit RSA public key for as little as $20,000 USD in AWS computing resources; a classical computer attempting to factor a properly generated 2048-bit RSA key should take longer than the age of the universe. And technology company RSA famously, and allegedly, programmed a backdoor into its BSAFE cryptographic library so the United States National Security Agency could easily decrypt communications thought to be secure by its users.

It is nice to have a discussion about what's to come and how to secure against it, but it is equally, if not more, important to focus on securing systems against the practical attacks that exist today. As more users and applications move outside the walls of corporate offices and corporate datacenters, companies will need to develop and implement security strategies before those users and applications go mobile. Post-quantum cryptography is still relatively new, and its algorithms have not been tested and vetted to the degree of existing standards (Diffie-Hellman, AES, SHA-2), so they should be used with caution.