Have we passed the Inflection Point of Quantum Computing?
How quantum, AI, energy, and cryptography are converging faster than expected
The last six months may mark a genuine turning point for quantum computing. Across hardware, cryptography, energy, and infrastructure, the signals are converging: quantum is no longer theoretical. It is already shaping infrastructure. After the National Institute of Standards and Technology (NIST) finalized the first set of post-quantum cryptography (PQC) standards, federal agencies began migrating to supported devices. Last week, the Cybersecurity and Infrastructure Security Agency (CISA) published its initial list of PQC-capable product categories, focusing on cloud, networking, and security, while FedRAMP is folding these requirements into updated authorization processes to ensure vendor readiness. In my view, Q-Day (the point at which encrypted data could be cracked by quantum computers) could arrive by the end of 2027 for those willing to pay for the necessary quantum computing time and power. Is your data organized, and do you know which devices on your network can be migrated to PQC ciphers and which must be replaced to protect your data in transit and at rest?
The Hardware Moment
In September 2025, Harvard researchers demonstrated the first quantum computer capable of running continuously for two hours, and theoretically indefinitely, by replenishing lost atoms during computation. Using what they call an "optical lattice conveyor belt" and optical tweezers, the team maintained a 3,000-qubit neutral-atom system by reloading up to 300,000 atoms per second. This approach addresses one of the most persistent challenges in quantum systems: atom loss. As Mikhail Lukin, the project's senior author, put it, the ability to replace qubits during operation without destroying existing information solves a fundamental bottleneck.
Meanwhile, IonQ achieved a milestone that had been an industry north star for decades: 99.99% two-qubit gate fidelity. This "four nines" benchmark, demonstrated on their Electronic Qubit Control technology, amounts to a tenfold reduction in gate error compared with the prior 99.9% standard and dramatically lowers error-correction requirements. The company projects this will form the basis for 256-qubit systems in 2026 and a path to millions of qubits by 2030.
Perhaps most telling was NVIDIA CEO Jensen Huang's June 2025 reversal on quantum timelines. After declaring in January that useful quantum computers were "at least 15 years away," Huang told the GTC Paris conference that quantum computing was reaching "an inflection point." The shift reflects broader industry momentum: IBM's roadmap now targets fault-tolerant systems by 2029, and NVIDIA has made substantial venture investments in quantum hardware companies including PsiQuantum, Quantinuum, and QuEra.
A Nobel Prize That Explains the Foundation
The 2025 Nobel Prize in Physics recognized John Clarke, Michel Devoret, and John Martinis for work conducted in 1984-1985 demonstrating quantum tunneling and discrete energy behavior in macroscopic Josephson-junction circuits. Their experiments proved that quantum mechanics could operate in systems large enough to hold in your hand. These superconducting circuits are the direct ancestors of today's qubits used by Google, IBM, and others. The Nobel Committee explicitly noted that the laureates' work "laid the foundation for superconducting quantum technologies, including the qubits now used in quantum computers."
Post-Quantum Cryptography: From Standards to Implementation
The cryptographic transition is no longer optional—it's operational. NIST finalized its first three PQC standards in August 2024: ML-KEM (FIPS 203, formerly CRYSTALS-Kyber) for general encryption and key exchange, ML-DSA (FIPS 204, formerly CRYSTALS-Dilithium) for digital signatures and authentication, and SLH-DSA (FIPS 205, formerly SPHINCS+) as a hash-based signature backup. In March 2025, NIST added HQC (Hamming Quasi-Cyclic) as a fifth algorithm, providing a code-based backup to the lattice-based ML-KEM. FALCON (FIPS 206), optimized for scenarios requiring smaller signature sizes, remains in development.
U.S. federal agencies are now racing toward PQC-ready hardware, but the implications extend far beyond government. Any organization handling data with long-term sensitivity, from healthcare records and financial instruments to intellectual property and infrastructure configurations, faces the "harvest now, decrypt later" threat. Adversaries are already collecting encrypted traffic with the expectation that quantum computers will eventually crack it. If your sensitive data has a shelf life extending beyond 2030, the migration window is now.
The recommended approach in 2026 is hybrid cryptography: running classical algorithms (RSA, ECC) alongside quantum-resistant ones. This protects against both current and future threats while providing a fallback if any PQC algorithm proves weaker than expected. Organizations should be designing for crypto-agility: the ability to swap cryptographic primitives without rebuilding systems from scratch. Hardware acceleration for PQC is advancing rapidly, addressing the computational overhead that early implementations introduced.
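To make the hybrid pattern concrete, here is a minimal sketch in Python that combines a classical X25519 exchange with an ML-KEM encapsulation and derives a single session key from both secrets. The X25519 and HKDF calls use the widely available cryptography package; the `mlkem` module and its `encapsulate` function are hypothetical stand-ins for whatever FIPS 203 implementation your stack provides.

```python
# Hybrid key establishment sketch: an attacker must break BOTH X25519 and ML-KEM
# to recover the session key. `mlkem` is a hypothetical PQC binding; substitute
# your vendor's ML-KEM (FIPS 203) library.
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

import mlkem  # hypothetical module exposing an encapsulate() function


def hybrid_session_key(peer_x25519_public, peer_mlkem_public):
    """Initiator side: derive a session key from a classical and a PQC secret."""
    # Classical component: ephemeral X25519 Diffie-Hellman exchange.
    eph = X25519PrivateKey.generate()
    classical_secret = eph.exchange(peer_x25519_public)

    # Post-quantum component: ML-KEM encapsulation against the peer's public key.
    ciphertext, pq_secret = mlkem.encapsulate(peer_mlkem_public)

    # Combine both secrets; compromising either one alone is not enough.
    session_key = HKDF(
        algorithm=hashes.SHA256(),
        length=32,
        salt=None,
        info=b"hybrid-x25519-mlkem-v1",
    ).derive(classical_secret + pq_secret)

    # Send eph.public_key() and ciphertext to the peer so it can derive the same key.
    return session_key, eph.public_key(), ciphertext
```

The concatenate-then-derive step is what gives the hybrid its security property: both components feed the key derivation, so breaking only the classical or only the post-quantum half yields nothing usable.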
Signal Shows What Deployment Looks Like
In October 2025, Signal deployed a significant upgrade to its encryption protocol: the Sparse Post-Quantum Ratchet (SPQR). Combined with their existing Double Ratchet to form the "Triple Ratchet," this hybrid approach uses ML-KEM alongside classical elliptic-curve cryptography. The design ensures that an attacker would need to break both systems to decrypt messages, providing protection against future quantum attacks while maintaining current security guarantees.
Signal's rollout is gradual and backward-compatible; conversations automatically upgrade when both parties have the new software. While not yet complete across all users, the move represents the first major consumer-facing application to integrate post-quantum cryptography at scale, with plans to make SPQR mandatory for all new conversations once adoption is widespread. For enterprise technology leaders, Signal's approach offers a template: hybrid deployment, graceful degradation, and incremental migration rather than a single cutover.
Scaling Beyond the 100-Qubit Barrier
For nearly a decade, the quantum industry has been stuck at roughly 100 qubits per processor. Google went from 53 to 105 qubits in six years; IBM's latest roadmap shows 120 qubits as the leading device size through 2028. The bottleneck isn't just physics; it's also wiring. In current 2D chip designs, approximately 90% of the chip area goes to signal routing rather than qubits.
In December 2025, QuantWare announced VIO-40K, a 3D scaling architecture that addresses this directly. By delivering signals vertically through stacked chiplet modules, the approach supports up to 40,000 input-output lines and enables 10,000-qubit processors (a 100× increase over current systems). The architecture is designed to work with NVIDIA's NVQLink for hybrid quantum-classical computing. However, the first devices won't ship until 2028, and raw qubit count alone doesn't guarantee fault-tolerant operation. Error correction, control fidelity, and energy efficiency at the logical-qubit level remain essential challenges.
The Cryptographic Timeline: What Leaders Need to Know
Estimating when quantum computers will break current encryption remains genuinely difficult, but the estimates are compressing. A May 2025 paper from Google Quantum AI researcher Craig Gidney dramatically reduced projections for breaking RSA-2048: from the 20 million noisy qubits projected in 2019 to fewer than one million qubits running for under a week. The reduction comes from algorithmic improvements and more efficient error correction, not hardware breakthroughs alone.
For Bitcoin's ECDSA signatures (using the secp256k1 curve), estimates suggest roughly 1,673-2,619 logical qubits would be needed (fewer than RSA-2048's requirements). Current expert consensus places the probability of a cryptanalytically relevant quantum computer existing by 2035 at greater than 50%, with some analyses suggesting the 2029-2032 window as most likely for initial capability.
For practical planning purposes, NIST's guidance to deprecate vulnerable algorithms by 2035 should be treated as a ceiling, not a target. Organizations should assume that well-resourced adversaries (e.g. nation-states, sophisticated criminal enterprises) will have access to quantum decryption capability before it becomes commercially available. The question isn't whether to migrate, but whether your migration timeline accounts for the data you're protecting today being vulnerable tomorrow.
Action Items for Technology and Business Leaders
The first order of business on any business leader's quantum readiness roadmap is to conduct a cryptographic inventory. Identify every system, device, and data store using RSA, ECC, or other vulnerable algorithms. This includes not just your applications but embedded systems, IoT devices, VPNs, TLS certificates, and third-party integrations. Many legacy devices cannot be upgraded and will need replacement.
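As a small starting point for that inventory, a script can surface which external endpoints still present RSA or ECC certificates. The sketch below uses only Python's standard ssl module and the cryptography package; the host list is a placeholder, and a real audit would also cover internal services, VPNs, embedded devices, and code-signing keys.

```python
# Minimal TLS certificate inventory sketch: reports the public-key algorithm
# each endpoint presents today. Hosts listed are placeholders for your own estate.
import socket
import ssl

from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import rsa, ec

HOSTS = ["www.example.com", "vpn.example.com"]  # replace with your inventory


def describe_key(host: str, port: int = 443) -> str:
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            der = tls.getpeercert(binary_form=True)
    cert = x509.load_der_x509_certificate(der)
    key = cert.public_key()
    if isinstance(key, rsa.RSAPublicKey):
        return f"RSA-{key.key_size} (quantum-vulnerable)"
    if isinstance(key, ec.EllipticCurvePublicKey):
        return f"ECC {key.curve.name} (quantum-vulnerable)"
    return type(key).__name__  # anything else: flag for manual review


if __name__ == "__main__":
    for host in HOSTS:
        print(host, "->", describe_key(host))
```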
Second, assess data sensitivity lifespans. Data that must remain confidential for 10+ years, including patient records, trade secrets, infrastructure configurations, and long-term contracts, requires immediate attention. Shorter-lived data can follow a longer migration path.
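One way to make "shelf life" concrete is the rule of thumb often attributed to Michele Mosca: if the years your data must stay confidential plus the years your migration will take exceed the years until a cryptanalytically relevant quantum computer exists, that data is already at risk. A minimal sketch, with purely illustrative numbers:

```python
# Mosca-style shelf-life check with illustrative numbers (not predictions).
def at_risk(confidential_years: float, migration_years: float, years_to_crqc: float) -> bool:
    """True if the data must stay secret beyond the point a cryptanalytically
    relevant quantum computer (CRQC) could exist, given your migration time."""
    return confidential_years + migration_years > years_to_crqc


# Example: patient records kept 15 years, a 3-year migration, CRQC assumed in ~7 years.
print(at_risk(confidential_years=15, migration_years=3, years_to_crqc=7))  # True -> act now
```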
Third, design for crypto-agility. Any system built or upgraded in 2026 should support algorithm substitution without architectural changes. This means abstracting cryptographic functions, maintaining clear documentation of cryptographic dependencies, and planning for ongoing algorithm updates as the PQC landscape matures.
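As a sketch of what "abstracting cryptographic functions" can mean in practice, the pattern below routes all signing through a small registry keyed by algorithm name, so swapping ECDSA for ML-DSA becomes a configuration change rather than a code rewrite. The names and interfaces are illustrative, not a standard API.

```python
# Crypto-agility sketch: callers depend on an abstract Signer, not a specific algorithm.
# Registry names and classes are illustrative; wire in your real implementations.
from abc import ABC, abstractmethod


class Signer(ABC):
    @abstractmethod
    def sign(self, message: bytes) -> bytes: ...

    @abstractmethod
    def verify(self, message: bytes, signature: bytes) -> bool: ...


_REGISTRY: dict[str, type[Signer]] = {}


def register(name: str):
    def wrap(cls: type[Signer]) -> type[Signer]:
        _REGISTRY[name] = cls
        return cls
    return wrap


def get_signer(name: str) -> Signer:
    # One lookup point: changing "ecdsa-p256" to "ml-dsa-65" in configuration
    # switches every caller without touching application code.
    return _REGISTRY[name]()


@register("ecdsa-p256")
class EcdsaSigner(Signer):
    def sign(self, message: bytes) -> bytes:
        raise NotImplementedError("plug in your classical implementation")

    def verify(self, message: bytes, signature: bytes) -> bool:
        raise NotImplementedError


@register("ml-dsa-65")
class MlDsaSigner(Signer):
    def sign(self, message: bytes) -> bytes:
        raise NotImplementedError("plug in your FIPS 204 implementation")

    def verify(self, message: bytes, signature: bytes) -> bool:
        raise NotImplementedError
```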
Fourth, implement hybrid deployments where possible. Running ML-KEM alongside existing key exchange (as Signal demonstrates) provides immediate protection without abandoning proven security. This approach is now the industry standard recommendation.
Fifth, engage with NIST, CISA, and the NCCoE migration resources. These agencies are providing implementation guidance, testing tools, and reference architectures specifically designed to help organizations transition. The Migration to Post-Quantum Cryptography project at NCCoE offers practical demonstrations and interoperability testing.
Computing at the Edge of Earth
As AI workloads consume ever more energy, some companies are looking beyond terrestrial constraints entirely. In November 2025, Starcloud launched the first NVIDIA H100 GPU into orbit aboard a SpaceX rocket. The 60 kg satellite successfully ran inference on Google's Gemma model and trained NanoGPT on Shakespeare's complete works, the first AI model ever trained in space.
Crusoe, an AI infrastructure company that has built its business model around co-locating compute with stranded energy sources, announced a partnership with Starcloud to deploy cloud workloads on a satellite launching in late 2026, with limited GPU capacity expected from orbit by early 2027. The value proposition is straightforward: space offers abundant solar energy without atmospheric losses, no land use requirements, and the vacuum serves as an effectively infinite heat sink. Whether orbital compute proves economically viable at scale remains to be demonstrated, but the first experiments are now underway.
Quantum Becomes Infrastructure
What makes the current moment feel different isn't any single breakthrough, but rather the convergence of so many breakthroughs. Hardware is scaling in multiple modalities simultaneously. Post-quantum cryptography has moved from standards documents to deployed code, with federal mandates driving adoption. Energy constraints are pushing infrastructure toward novel solutions. And the Nobel Prize committee chose this year to recognize the foundational work that made superconducting qubits possible.
None of this means quantum computing has arrived for general-purpose use. Error correction remains hard. Logical qubit counts remain low. Most applications still require breakthroughs we haven't achieved. But the transition from "when" to "how fast" (from theoretical possibility to engineering challenge) appears to be underway. Quantum is becoming infrastructure.
For technology and business leaders, the strategic question has shifted. It's no longer about monitoring quantum progress from a distance. It's about whether your organization's cryptographic foundations will survive the transition - are you building the visibility and agility to adapt as the timeline continues to compress?