Dominic Jainy is a distinguished IT professional and a leading voice in the intersection of artificial intelligence, blockchain, and advanced network security. With a career dedicated to demystifying complex technological shifts, he has become a pivotal figure in helping organizations navigate the transition toward quantum-resistant infrastructures. As the industry grapples with the sunset of classical encryption, Jainy’s expertise provides a roadmap for securing data against the nascent but formidable power of quantum computing.
In this discussion, we explore the strategic and technical nuances of post-quantum cryptography, specifically focusing on the “harvest now, decrypt later” threat and the practical implementation of NIST-standardized algorithms. Jainy breaks down the engineering behind hybrid encryption models, the importance of securing the entire network path from device to origin, and the shifting regulatory landscape that is forcing the private sector to accelerate its defensive timelines.
“Harvest now, decrypt later” attacks involve actors collecting encrypted traffic to break it once quantum computers arrive. Which specific types of corporate data are most vulnerable to this long-term threat, and how should security teams prioritize which assets to migrate to post-quantum standards first?
The reality of “harvest now, decrypt later” is that any data with a shelf life of ten years or more is currently at high risk. We are talking about intellectual property, trade secrets, and long-term financial strategies that could still be damaging if exposed in 2035. Specifically, health records and classified government communications are the most vulnerable because their sensitivity doesn’t expire; a patient’s genetic profile or a state secret remains relevant for decades. Security teams need to perform a “cryptographic inventory” to identify these long-term assets and move them to the front of the migration line. By prioritizing the most enduring data, organizations can ensure that even if a nation-state actor is stockpiling their traffic today, the “prize” they unlock in the future will be mathematically indecipherable.
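The prioritization Jainy describes can be sketched with a Mosca-style test: an asset is exposed to "harvest now, decrypt later" if its confidentiality shelf life plus its migration time exceeds the estimated arrival of a cryptographically relevant quantum computer. This is a minimal illustration; the asset names, year values, and ten-year horizon are assumptions, not real inventory data.

```python
from dataclasses import dataclass

QUANTUM_HORIZON_YEARS = 10  # assumed estimate; tune to your own threat model

@dataclass
class Asset:
    name: str
    shelf_life_years: int    # how long exposure would still be damaging
    migration_years: int     # how long moving it to post-quantum would take

def at_risk(asset: Asset, horizon: int = QUANTUM_HORIZON_YEARS) -> bool:
    """Mosca's inequality: shelf life + migration time > quantum horizon."""
    return asset.shelf_life_years + asset.migration_years > horizon

# Hypothetical cryptographic inventory for illustration only.
inventory = [
    Asset("patient genetic records", shelf_life_years=50, migration_years=2),
    Asset("quarterly marketing copy", shelf_life_years=1, migration_years=1),
    Asset("long-term financial strategy", shelf_life_years=12, migration_years=1),
]

# Migrate the longest-lived exposed data first.
queue = sorted((a for a in inventory if at_risk(a)),
               key=lambda a: a.shelf_life_years, reverse=True)
for a in queue:
    print(a.name)
```

The sort order is the point: short-lived data can wait, while records whose sensitivity outlives the quantum horizon go to the front of the migration line.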
A hybrid model combines classical X25519 with the newer ML-KEM algorithm to provide a safety net. What are the engineering hurdles of managing these dual handshakes simultaneously, and how do you ensure that the additional data overhead doesn’t negatively impact user latency or connection speeds?
The primary engineering hurdle is ensuring that the system can gracefully handle a “belt-and-suspenders” approach without breaking existing protocols. We are essentially running a newer algorithm like ML-KEM-768 alongside the tried-and-true X25519 exchange and combining both shared secrets, which requires careful orchestration at the TLS layer. While the hybrid model adds about 1 kilobyte of data to the initial connection setup, our testing shows that this overhead is negligible on modern high-speed networks. The computational cost is also minimal; modern CPUs handle these extra cycles so efficiently that the average employee won’t feel a single millisecond of drag. We focus on making this transition transparent, so the security is “always on” without the user ever realizing their packets are carrying a quantum-resistant shield.
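The “belt-and-suspenders” combination step can be illustrated in a few lines: both shared secrets feed the key schedule, so an attacker must break both primitives to recover traffic keys. This is a simplified sketch with random placeholder secrets; a real TLS 1.3 stack derives them from the actual X25519 and ML-KEM-768 exchanges and runs the full HKDF-based key schedule, and the registered X25519MLKEM768 group fixes a specific concatenation order.

```python
import hashlib
import hmac
import os

# Placeholder secrets; in practice these come from the live handshake.
classical_ss = os.urandom(32)   # would come from the X25519 exchange
pq_ss = os.urandom(32)          # would come from ML-KEM-768 decapsulation

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    """HKDF-Extract (RFC 5869): HMAC-SHA256 over the input keying material."""
    return hmac.new(salt, ikm, hashlib.sha256).digest()

# Concatenate both secrets before extraction, so either one alone is
# insufficient to derive the traffic keys. (Order here is illustrative.)
combined = hkdf_extract(salt=b"\x00" * 32, ikm=classical_ss + pq_ss)
print(len(combined))  # 32-byte secret feeding the TLS key schedule
```

If the quantum assumption behind ML-KEM ever fails, X25519 still protects the session; if X25519 falls to a quantum adversary, ML-KEM does. That is the safety net.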
Securing the entire path from a remote employee’s device to a private origin server requires post-quantum protection at every network segment. How does integrating these keys into device agents and server-side tunnels simplify the transition, and what steps must be taken to prevent any “weak links” in the chain?
Integrating post-quantum keys directly into tools like the WARP device agent and server-side tunnels is a game-changer because it removes the burden of manual configuration from the IT staff. Instead of rearchitecting the internal network, an admin can simply push a software update to the edge and the origin connector. This creates a “secure tunnel” where the data is encrypted at the laptop, remains protected through the global network, and is only decrypted once it reaches the corporate application. To prevent weak links, it is vital to ensure that every hop—from the endpoint to the gateway and then to the data center—is utilizing the same ML-KEM standard. If even one segment falls back to classical-only encryption, the entire path becomes susceptible to future decryption.
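The weak-link rule Jainy describes lends itself to a simple audit: enumerate every hop from device to origin and flag any segment that negotiated a classical-only key exchange. The hop names and negotiated groups below are illustrative assumptions, not output from a real tool.

```python
# Toy device-to-origin path audit: every hop must negotiate a hybrid
# (classical + ML-KEM) group, or the whole path inherits the weakest link.
path = [
    ("laptop -> edge (device agent)", "X25519MLKEM768"),
    ("edge -> gateway (backbone)",    "X25519MLKEM768"),
    ("gateway -> origin (tunnel)",    "X25519"),  # classical-only: weak link
]

# Hybrid TLS named groups; extend as your stack adds more.
HYBRID_GROUPS = {"X25519MLKEM768", "SecP256r1MLKEM768"}

weak_links = [hop for hop, group in path if group not in HYBRID_GROUPS]
for hop in weak_links:
    print("harvestable segment:", hop)
```

Any segment this flags is exactly the traffic a patient adversary would stockpile today, regardless of how strong the other hops are.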
Federal mandates and NIST standards are now pushing the private sector toward quantum-resistant frameworks. How will these new requirements change the way organizations evaluate SASE providers, and what specific metrics should they use to verify that a platform offers true end-to-end protection rather than just edge security?
The shift toward NIST standards like FIPS 203 means that “post-quantum” will soon move from a marketing buzzword to a mandatory compliance checkbox, much like FedRAMP. Organizations will stop looking just at “edge security” and start demanding proof of end-to-end resilience. When evaluating a SASE provider, the key metric is “cryptographic coverage”: does the post-quantum protection extend all the way to the origin server, or does it stop at the provider’s nearest data center? You should ask for documentation on whether they use hybrid key exchanges and if their internal backbone traffic is also shielded. A provider that only secures the first mile is leaving the most sensitive part of the journey—the internal corporate traffic—exposed to long-term harvesting.
While key encapsulation is becoming more common, post-quantum digital signatures still face challenges due to their larger size. What are the primary technical obstacles to implementing these signatures across the current web trust model, and how might this affect the performance of encrypted handshakes in the future?
The obstacle with digital signatures like ML-DSA is that they are significantly bulkier than the ECDSA signatures we use today: a DER-encoded ECDSA P-256 signature is roughly 71 bytes, while even the smallest ML-DSA parameter set produces signatures over 2 kilobytes. In the current Web PKI model, these large signatures must be sent during the initial handshake, which can lead to packet fragmentation and increased latency. This is why many platforms are prioritizing key encapsulation (ML-KEM) first—it’s the “low-hanging fruit” that protects the data itself. Transitioning signatures will eventually require updates to Certificate Authorities and potentially a rethink of how certificates are bundled and transmitted. If we don’t optimize this, we could see a noticeable slowdown in how quickly websites load, as the “handshake” becomes a much heavier conversation between the client and the server.
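The size gap is easy to quantify from the published parameter sets. The ML-DSA signature sizes below are the byte counts specified in FIPS 204; the ECDSA figure is a typical DER-encoded P-256 signature.

```python
# Signature sizes in bytes: ECDSA P-256 (typical DER encoding) versus
# the three ML-DSA parameter sets standardized in FIPS 204.
SIG_BYTES = {
    "ECDSA P-256": 71,
    "ML-DSA-44":   2420,
    "ML-DSA-65":   3309,
    "ML-DSA-87":   4627,
}

baseline = SIG_BYTES["ECDSA P-256"]
for name, size in SIG_BYTES.items():
    print(f"{name}: {size} B ({size / baseline:.0f}x ECDSA)")
```

A certificate chain carries several signatures, so multiplying each by thirty to sixty times its classical size is what pushes handshakes past typical initial congestion windows and into fragmentation territory.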
Cryptographic transitions, such as the shift to TLS 1.3, historically take many years to complete. Given that quantum-resistant updates require changes to both client and server software, what are the practical difficulties of achieving 100% adoption across legacy enterprise environments and diverse hardware fleets?
Achieving total adoption is a marathon because enterprise environments are often cluttered with “black box” legacy systems that don’t support modern TLS libraries. We saw with the SHA-1 to SHA-2 migration that it took over 10 years for the industry to fully move on, and many internal servers still linger on outdated protocols. The difficulty lies in the sheer variety of hardware, from old industrial sensors to proprietary mainframes that cannot easily be patched to support ML-KEM. To get to 100%, organizations must adopt a “proxy-first” strategy, where a modern gateway handles the quantum-resistant handshake on behalf of the legacy equipment. It’s about building a protective shell around the old tech until it can eventually be decommissioned or replaced.
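The “proxy-first” decision can be sketched as a simple per-device triage: equipment that can be patched gets native hybrid TLS, while black-box legacy gear sits behind a gateway that terminates the quantum-resistant handshake on its behalf. The fleet below is a hypothetical illustration, not a real deployment model.

```python
# Hypothetical device fleet with a single capability flag.
fleet = {
    "industrial sensor (2009)": {"supports_mlkem": False},
    "proprietary mainframe":    {"supports_mlkem": False},
    "patched linux server":     {"supports_mlkem": True},
}

def plan(device: str) -> str:
    """Route each device to native hybrid TLS or a gateway-terminated shell."""
    if fleet[device]["supports_mlkem"]:
        return "native hybrid TLS end-to-end"
    # PQ protection stops at the gateway; the final hop stays on a tightly
    # scoped internal segment until the device is replaced.
    return "terminate hybrid TLS at gateway, proxy to device"

for device in fleet:
    print(device, "->", plan(device))
```

The protective shell is a stopgap by design: the audit output doubles as the decommissioning backlog.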
What is your forecast for post-quantum cryptography?
I anticipate that by 2027, post-quantum cryptography will no longer be an optional “opt-in” feature but will become the default standard for all major cloud and SASE providers. We will likely see a rapid consolidation where vendors who haven’t integrated NIST-standardized algorithms lose significant market share in the government and financial sectors. As browsers like Chrome and Firefox continue to bake these protocols into their cores, the “quantum-secure web” will happen almost invisibly for the average consumer. However, the real battle will remain in the deep enterprise layers, where the “harvest now, decrypt later” threat will force a decade-long cleaning of the cryptographic “closet” to eliminate legacy vulnerabilities. We are entering an era where mathematical resilience is the only true perimeter.
