
The New Quantum Era - innovation in quantum computing, science and technology


Author(s): Sebastian Hassinger

About this audio

Your host, Sebastian Hassinger, interviews brilliant research scientists, software developers, engineers and others actively exploring the possibilities of our new quantum era. We will cover topics in quantum computing, networking and sensing, focusing on hardware, algorithms and general theory. The show aims for accessibility - Sebastian is not a physicist - and we'll try to provide context for the terminology and glimpses at the fascinating history of this new field as it evolves in real time.

(c) Sebastian Hassinger 2025 · Physics, Science
Episodes
  • Peaked quantum circuits with Hrant Gharibyan
    Dec 12 2025
    In this episode of The New Quantum Era, Sebastian talks with Hrant Gharibyan, CEO and co‑founder of BlueQubit, about “peaked circuits” and the challenge of verifying quantum advantage. They unpack Scott Aaronson and Yuxuan Zhang’s original peaked‑circuit proposal, BlueQubit’s scalable implementation on real hardware, and a new public challenge that invites the community to attack their construction using the best classical algorithms available. Along the way, they explore how this line of work connects to cryptography, hardness assumptions, and the near‑term role of quantum devices as powerful scientific instruments.

    Topics Covered

    Why verifying quantum advantage is hard
    The core problem: if a quantum device claims to solve a task that is classically intractable, how can anyone check that it did the right thing? Random circuit sampling (as in Google’s 2019 “supremacy” experiment and follow‑on work from Google and Quantinuum) is believed to be classically hard to simulate, but the verification metrics (like cross‑entropy benchmarking) are themselves classically intractable at scale: the linear XEB score averages the ideal output probabilities of the sampled bit strings, and those probabilities can only be obtained by simulating the circuit.

    What are peaked circuits?
    Aaronson and Zhang’s idea: construct circuits that look like random circuits in every respect, but whose output distribution secretly has one special bit string with an anomalously high probability (the “peak”). The designer knows the secret bit string, so a quantum device can be verified by checking that measurement statistics visibly reveal the peak in a modest number of shots, while finding that same peak classically should be as hard as simulating a random circuit.
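    Since the verification step is concrete (prepare the circuit, measure, look for the dominant bit string), here is a minimal sketch, assuming Qiskit is available. It is only trivially peaked: a random scramble cancelled by its inverse and then steered onto a secret string, so it illustrates the verification idea rather than the hardness claim; the real construction replaces the obvious cancellation with the obfuscation layers described below.

```python
# Toy peaked-circuit sketch (hypothetical illustration; assumes Qiskit).
# Trivially peaked: scramble * scramble^-1 leaves |0...0>, then X gates
# steer the state onto a chosen secret string. The real Aaronson-Zhang /
# BlueQubit constructions hide this structure behind obfuscation layers.
from qiskit import QuantumCircuit
from qiskit.circuit.random import random_circuit
from qiskit.quantum_info import Statevector

n = 6                # toy size; the hardware demo discussed used ~56 qubits
secret = "110101"    # the hidden bit string (the "peak")

scramble = random_circuit(n, depth=8, seed=7)
qc = QuantumCircuit(n)
qc.compose(scramble, inplace=True)
qc.compose(scramble.inverse(), inplace=True)  # obfuscation would go here
for i, bit in enumerate(reversed(secret)):    # Qiskit prints qubit 0 rightmost
    if bit == "1":
        qc.x(i)

# Verification: a modest number of shots makes the peak visible. Here we
# sample the ideal state; on hardware these would be real measurement shots.
counts = Statevector.from_instruction(qc).sample_counts(shots=100)
assert max(counts, key=counts.get) == secret
print(counts)  # the secret string dominates the histogram
```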
    BlueQubit’s scalable construction and hardware demo
    BlueQubit extended the original 24‑qubit, simulator‑based peaked‑circuit construction to much larger sizes using new classical protocols. Hrant explains their protocol for building peaked circuits on Quantinuum’s H2 processor with around 56 qubits, thousands of gates, and effectively all‑to‑all connectivity, while still hiding a single secret bit string that appears as a clear peak when run on the device.

    Obfuscation tricks and “quantum steganography”
    The team uses multiple obfuscation layers (including “swap” and “sweeping” tricks) to transform simple peaked circuits into ones that are statistically indistinguishable from generic random circuits, yet still preserve the hidden peak.

    The BlueQubit Quantum Advantage Challenge
    To stress‑test their hardness assumptions, BlueQubit has published concrete circuits and launched a public bounty (currently a quarter of a bitcoin) for anyone who can recover the secret bit string classically. The aim is to catalyze work on better classical simulation and de‑quantization techniques; either someone closes the gap (forcing the protocol to evolve) or the standing bounty helps establish public trust that the task really is classically infeasible.

    Potential cryptographic angles
    Although the main focus is verification of quantum advantage, Hrant outlines how the construction has a cryptographic flavor: the secret bit string effectively acts as a key, and only a sufficiently powerful quantum device can efficiently “decrypt” it by revealing the peak. Variants of the protocol could, in principle, yield schemes that are classically secure but only decryptable by quantum hardware, and even quantum‑plus‑key secure, though this remains speculative and secondary to the verification use case.

    From verification protocol to startup roadmap
    Hrant positions BlueQubit as an algorithm and capability company: deeply hardware‑aware, but focused on building and analyzing advantage‑style algorithms tailored to specific devices. The peaked‑circuit work is one pillar in a broader effort that includes near‑term scientific applications in condensed‑matter physics and materials (e.g., Fermi–Hubbard models and out‑of‑time‑ordered correlators) where quantum devices can already probe regimes beyond leading classical methods.

    Scientific advantage today, commercial advantage tomorrow
    Sebastian and Hrant emphasize that the first durable quantum advantages are likely to appear in scientific computing, with quantum devices acting as exotic lab instruments for physicists, chemists, and materials scientists well before mass‑market “killer apps” arrive. Once robust, verifiable scientific advantage is established, scaling to larger models and more complex systems becomes a question of engineering, with clear lines of sight to industrial impact in sectors like pharmaceuticals, advanced materials, and manufacturing.

    The challenge: https://app.bluequbit.io/hackathons/
    30 min
  • Diamond vacancies and scalable qubits with Quantum Brilliance
    Dec 6 2025

    Episode overview
    This episode of The New Quantum Era features a conversation with Quantum Brilliance co‑founder and CEO Mark Luo and independent board chair Brian Wong about diamond nitrogen vacancy (NV) centers as a platform for both quantum computing and quantum sensing. The discussion covers how NV centers work, what makes diamond‑based qubits attractive at room temperature, and how to turn a lab technology into a scalable product and business.

    What are diamond NV qubits?
    Mark explains how nitrogen vacancy centers in synthetic diamond act as stable room‑temperature qubits, with a nitrogen atom adjacent to a missing carbon atom creating a spin system that can be initialized and read out optically or electronically. The rigidity and thermal properties of diamond remove the need for cryogenics, complex laser setups, and vacuum systems, enabling compact, low‑power quantum devices that can be deployed in standard environments.
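    For context, a textbook gloss (not worked through in the episode): the NV ground state is an electron spin‑1 system, and the level structure that makes room‑temperature microwave control possible is captured by the standard Hamiltonian below.

```latex
% Standard NV-center ground-state spin Hamiltonian (textbook form, included
% here as background; not taken from the episode).
% D ~ 2.87 GHz is the zero-field splitting, E a small strain term, and
% gamma_e / 2pi ~ 28 GHz/T the electron gyromagnetic ratio.
\frac{H}{h} = D\,S_z^2 + E\left(S_x^2 - S_y^2\right)
            + \frac{\gamma_e}{2\pi}\,\vec{B}\cdot\vec{S}
```

    The Zeeman term shifts the m_s = ±1 levels with magnetic field, which is the handle behind the magnetometry applications discussed in the next segment, while spin‑dependent fluorescence provides the optical initialization and readout mentioned above.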

    Quantum sensing to quantum computing
    NV centers are already enabling ultra‑sensitive sensing, from nanoscale MRI and quantum microscopy to magnetometry for GPS‑free navigation and neurotech applications using diamond chips under growing brain cells. Mark and Brian frame sensing not as a hedge but as a volume driver that builds the diamond supply chain, pushes costs down, and lays the manufacturing groundwork for future quantum computing chips.

    Fabrication, scalability, and the value chain
    A key theme is the shift from early “shotgun” vacancy placement in diamond to a semiconductor‑style, wafer‑like process with high‑purity material, lithography, characterization, and yield engineering. Brian characterizes Quantum Brilliance’s strategy as “lab to fab”: deciding where to sit in the value chain, leveraging the existing semiconductor ecosystem, and building a partner network rather than owning everything from chips to compilers.

    Devices, roadmaps, and hybrid nodes
    Quantum Brilliance has deployed room‑temperature systems with a handful of physical qubits at Oak Ridge National Laboratory, Fraunhofer IAF, and the Pawsey Supercomputing Centre. Their roadmap targets application‑specific quantum computing with useful qubit counts toward the end of this decade, and lunchbox‑scale, fault‑tolerant systems with on the order of 50–60 logical qubits in the mid‑2030s.

    Modality tradeoffs and business discipline
    Mark positions diamond NV qubits as mid‑range in both speed and coherence time compared with superconducting and trapped‑ion systems, with their differentiator being compute density, energy efficiency, and ease of deployment rather than raw gate speed. Brian brings four decades of experience in semiconductors, batteries, lidar, and optical networking to emphasize milestones, early revenue from sensing, and usability—arguing that making quantum devices easy to integrate and operate is as important as the underlying physics for attracting partners, customers, and investors.

    Partners and ecosystem
    The episode underscores how collaborations with institutions such as Oak Ridge, Fraunhofer, and Pawsey, along with industrial and defense partners, help refine real‑world requirements and ensure the technology solves concrete problems rather than just hitting abstract benchmarks. By co‑designing with end users and complementary hardware and software vendors, Quantum Brilliance aims to “democratize” access to quantum devices, moving them from specialized cryogenic labs to desks, edge systems, and embedded platforms.

    37 min
  • Macroscopic Quantum Tunneling with Nobel Laureate John Martinis
    Nov 26 2025
    Episode overview
    John Martinis, Nobel laureate and former head of Google’s quantum hardware effort, joins Sebastian Hassinger on The New Quantum Era to trace the arc of superconducting quantum circuits, from the first demonstrations of macroscopic quantum tunneling in the 1980s to today’s push for wafer-scale, manufacturable qubit processors. The episode weaves together the physics of “synthetic atoms” built from Josephson junctions, the engineering mindset needed to turn them into reliable computers, and what it will take for fabrication to unlock true large-scale quantum systems.

    Guest bio
    John M. Martinis is a physicist whose experiments on superconducting circuits with John Clarke and Michel Devoret at UC Berkeley established that a macroscopic electrical circuit can exhibit quantum tunneling and discrete energy levels, work recognized by the 2025 Nobel Prize in Physics “for the discovery of macroscopic quantum mechanical tunnelling and energy quantisation in an electric circuit.” He went on to lead the superconducting quantum computing effort at Google, where his team demonstrated large-scale, programmable transmon-based processors, and now heads Qolab (also referred to in the episode as CoLab), a startup focused on advanced fabrication and wafer-scale integration of superconducting qubits.

    Martinis’s career sits at the intersection of precision instrumentation and systems engineering, drawing on a scientific “family tree” that runs from Cambridge through John Clarke’s group at Berkeley, with strong theoretical influence from Michel Devoret and deep exposure to ion-trap work by Dave Wineland and Chris Monroe at NIST. Today his work emphasizes solving the hardest fabrication and wiring challenges: pursuing high-yield, monolithic, wafer-scale quantum processors that can ultimately host tens of thousands of reproducible qubits on a single 300 mm wafer.

    Key topics

    Macroscopic quantum tunneling on a chip
    How Clarke, Devoret, and Martinis used a current-biased Josephson junction to show that a macroscopic circuit variable obeys quantum mechanics, with microwave control revealing discrete energy levels and tunneling between states, laying the groundwork for superconducting qubits. The episode connects this early work directly to the Nobel committee’s citation and to today’s use of Josephson circuits as “synthetic atoms” for quantum computing.
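    A textbook sketch of the underlying model (included for context; the episode does not work through the math): the junction phase δ behaves like a particle in a tilted washboard potential, and what the Berkeley experiments measured was the rate at which it escapes the well.

```latex
% Tilted-washboard model of a current-biased Josephson junction
% (standard textbook treatment, included here as background).
U(\delta) = -E_J\!\left(\cos\delta + \frac{I}{I_c}\,\delta\right),
\qquad E_J = \frac{\Phi_0 I_c}{2\pi},
\qquad \Delta U \approx \frac{4\sqrt{2}}{3}\,E_J\!\left(1-\frac{I}{I_c}\right)^{3/2}

% At low temperature, tunneling through the barrier dominates thermal
% activation over it:
\Gamma_{\mathrm{MQT}} \sim \frac{\omega_p}{2\pi}\,
  \exp\!\left(-7.2\,\frac{\Delta U}{\hbar\omega_p}\right),
\qquad
\omega_p = \sqrt{\frac{2\pi I_c}{\Phi_0 C}}
  \left[1-\left(\frac{I}{I_c}\right)^2\right]^{1/4}
```

    Microwave drive at a frequency matching the level spacing resonantly enhances escape, which is how the discrete levels (the “energy quantisation” of the Nobel citation) were observed.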
    From DC devices to microwave qubits
    Why early Josephson devices were treated as low-frequency, DC elements, and how failed experiments pushed Martinis and collaborators to re-engineer their setups with careful microwave filtering, impedance control, and dilution refrigerators, turning noisy circuits into clean, quantized systems suitable for qubits. This shift to microwave control and readout becomes the through-line from macroscopic tunneling experiments to modern transmon qubits and multi-qubit gates.

    Synthetic atoms vs natural atoms
    The contrast between macroscopic “synthetic atoms” built from capacitors, inductors, and Josephson junctions and the natural atomic systems used in ion-trap and neutral-atom experiments by groups such as Wineland and Monroe at NIST, where single-atom control made the quantum nature more obvious. The conversation highlights how both approaches converged on single-particle control, but with very different technological paths and community cultures.

    Ten-year learning curve for devices
    How roughly a decade of experiments on quantum noise, energy levels, and escape rates in superconducting devices built confidence that these circuits were “clean enough” to support serious qubit experiments, just as early demonstrations such as Yasunobu Nakamura’s single-Cooper-pair box showed clear two-level behavior. This foundational work set the stage for the modern era of superconducting quantum computing across academia and industry.

    Surface code and systems thinking
    Why Martinis immersed himself in the surface code, co-authoring a widely cited tutorial-style paper, “Surface codes: Towards practical large-scale quantum computation” (Austin G. Fowler, Matteo Mariantoni, John M. Martinis, Andrew N. Cleland, Phys. Rev. A 86, 032324, 2012; arXiv:1208.0928), to translate error-correction theory into something experimentalists could build; a standard scaling heuristic from that line of work is sketched after this list. He describes this as a turning point that reframed his work at UC Santa Barbara and Google around full-system design rather than isolated device physics.

    Fabrication as the new frontier
    Martinis argues that the physics of decent transmon-style qubits is now well understood and that the real bottleneck is industrial-grade fabrication and wiring, not inventing ever more qubit variants. His company’s roadmap targets wafer-scale integration, e.g., ~100-qubit test chips scaling toward ~20,000 qubits on a 300 mm wafer, with a focus on yield, junction reproducibility, and integrated escape wiring rather than current approaches that tile many 100-qubit dies into larger systems.

    From lab ...
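    As background for the surface-code topic above (a sketch for context, not a figure quoted in the episode): once the physical error rate p is below the threshold p_th of roughly 1%, the logical error rate is suppressed exponentially in the code distance d.

```latex
% Standard surface-code scaling heuristic (cf. Fowler et al., PRA 86, 032324),
% for physical error rate p below threshold p_th and odd code distance d:
p_L \approx A \left(\frac{p}{p_{\mathrm{th}}}\right)^{(d+1)/2}
% The price is qubit count: on the order of 2d^2 physical qubits
% (data plus measurement qubits) per logical qubit.
```

    This is the trade that turned error correction into an engineering roadmap: push p safely below threshold, then buy reliability with qubit count.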
    49 min