Daniel Bernstein at the University of Illinois Chicago says that the US National Institute of Standards and Technology (NIST) is deliberately obscuring the level of involvement the US National Security Agency (NSA) has in developing new encryption standards for “post-quantum cryptography” (PQC). He also believes that NIST has made errors – either accidental or deliberate – in calculations describing the security of the new standards. NIST denies the claims.
“NIST isn’t following procedures designed to stop NSA from weakening PQC,” says Bernstein. “People choosing cryptographic standards should be transparently and verifiably following clear public rules so that we don’t need to worry about their motivations. NIST promised transparency and then claimed it had shown all its work, but that claim simply isn’t true.”
The mathematical problems that today’s encryption relies on are practically impossible for even the largest supercomputers to crack. But once quantum computers become powerful and reliable enough, they will be able to solve those problems quickly enough to break the encryption.
Although it is unclear when such computers will emerge, NIST has been running a project since 2012 to standardise a new generation of algorithms that resist their attacks. Bernstein, who coined the term post-quantum cryptography in 2003 to refer to these kinds of algorithms, says the NSA is actively engaged in inserting secret weaknesses into the new encryption standards, weaknesses that would let anyone with the right knowledge crack them more easily. Because NIST’s standards are used globally, any flaws could have a widespread impact.
Bernstein alleges that NIST’s calculations for one of the upcoming PQC standards, Kyber512, are “glaringly wrong”, making it appear more secure than it really is. He says that NIST multiplied two numbers together when it would have been more correct to add them, resulting in an artificially high assessment of Kyber512’s robustness to attack.
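To see why that distinction matters, consider a toy calculation. Security levels are quoted as powers of two, so multiplying two cost terms adds their exponents, while adding the terms barely moves the larger one. The figures in the sketch below are invented placeholders for illustration, not the numbers actually in dispute between Bernstein and NIST.

```python
# A minimal sketch of the arithmetic at issue. The exponents below are
# hypothetical placeholders, not the actual figures from NIST's analysis
# or Bernstein's critique; they only show how much the choice between
# multiplying and adding two cost terms changes a security estimate.
import math

bit_operations_exp = 140   # hypothetical attack cost: 2^140 bit operations
memory_access_exp = 25     # hypothetical extra cost attributed to memory: 2^25

# Multiplying the two costs adds their exponents, inflating the estimate:
multiplied_exp = bit_operations_exp + memory_access_exp   # 2^140 * 2^25 = 2^165

# Adding the two costs leaves the larger term essentially unchanged:
added_exp = math.log2(2**bit_operations_exp + 2**memory_access_exp)  # ~2^140

print(f"multiplied costs: ~2^{multiplied_exp}")   # ~2^165
print(f"added costs:      ~2^{added_exp:.0f}")    # ~2^140
```

On this illustrative arithmetic, multiplying credits the scheme with roughly 25 extra bits of security that adding would not, which is the shape of the discrepancy Bernstein describes.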