The Daily Qubit

LLMs improved with VQCs and quantum TNs, researchers call "First Photon Machine Learning" an experimental demonstration of quantum advantage in AI, quantum simulations for insights into DNA mutations, and more.

In partnership with

Thursday, October 24th, 2024

Enjoy a nice cup of freshly brewed quantum news ☕️ 

Today’s issue includes:

  • A new method improves the efficiency and accuracy of LLMs by replacing key layers with variational quantum circuits and quantum-inspired tensor networks.

  • A quantum model demonstrates improvements in performing image classification tasks with single-photon energy efficiency.

  • Researchers present the first quantum simulation of conical intersections in the biomolecule cytosine.

  • Plus, a $16M quantum computing complex that runs on hydropower, a theory to solve a decades-old mystery in superconducting qubit readout, the sounds of quantum music, and more.

And even more research, news, & events within quantum.

QUICK BYTE: A new publication from Multiverse Computing proposes a method to improve the efficiency and accuracy of LLMs by replacing key layers with variational quantum circuits and quantum-inspired tensor networks.

DETAILS

  • The hybrid quantum architecture for LLMs replaces weight matrices in layers such as the self-attention and multi-layer perceptron blocks with a combination of variational quantum circuits and quantum-inspired tensor networks. This retains classical LLM functionality while using quantum techniques to improve performance.

  • By incorporating quantum circuits, the model captures additional correlations that are unattainable through classical LLM architectures. The method decomposes weight matrices using tensor networks (see the sketch after this list) and applies disentanglers, quantum circuits designed to remove entanglement from these networks, which improves accuracy without significantly increasing memory requirements.

  • This builds on previous work that compressed LLMs using tensor networks, extending it by adding quantum circuits to the architecture for more efficient processing and greater scalability as quantum hardware continues to evolve.

  • The authors note that this quantum LLM could be among the first practical applications of quantum computing, as it aligns with the development roadmaps of quantum hardware providers, suggesting that models with hundreds of qubits and layers could soon be viable.
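
For a concrete picture of the tensor-network step this work builds on, here is a minimal NumPy sketch (an illustration with assumed toy dimensions, not the paper's actual pipeline): a dense weight matrix is split into a two-core matrix product operator by a truncated SVD, with the bond dimension chi controlling the compression. The paper's full method additionally inserts variational quantum circuits as disentanglers, which this toy omits.

```python
import numpy as np

d_out, d_in = 64, 64                     # toy layer dimensions (hypothetical)
chi = 8                                  # MPO bond dimension controlling compression
W = np.random.randn(d_out, d_in)         # stand-in for a trained weight matrix

# Split each matrix index into a pair, group (o1, i1) against (o2, i2),
# and let a truncated SVD cut W into two local tensors (a two-core MPO).
o1 = o2 = int(np.sqrt(d_out))
i1 = i2 = int(np.sqrt(d_in))
T = W.reshape(o1, o2, i1, i2).transpose(0, 2, 1, 3).reshape(o1 * i1, o2 * i2)

U, S, Vh = np.linalg.svd(T, full_matrices=False)
U, S, Vh = U[:, :chi], S[:chi], Vh[:chi, :]      # keep only chi singular values

core1 = (U * S).reshape(o1, i1, chi)             # left MPO core
core2 = Vh.reshape(chi, o2, i2)                  # right MPO core

# Contract the two cores back into an approximate weight matrix.
W_approx = np.einsum('oib,bpj->opij', core1, core2).reshape(d_out, d_in)

print(f"dense params: {W.size}, MPO params: {core1.size + core2.size}, "
      f"relative error: {np.linalg.norm(W - W_approx) / np.linalg.norm(W):.3f}")
```

A random matrix has little low-rank structure, so the truncation error printed here is large; the premise of the earlier compression work is that trained LLM weights are far more structured, letting a small bond dimension retain accuracy.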

QUICK BYTE: A new quantum model from the Stevens Institute of Technology and Quantum Computing Inc., "First Photon Machine Learning," demonstrates the superiority of quantum superposition in performing image classification tasks with single-photon energy efficiency.

DETAILS

  • The quantum machine learning model is based on single-photon interactions and inspired by the quantum superposition demonstrated in double-slit experiments, allowing a single photon to probe multiple pixels in parallel for image recognition (see the toy comparison after this list). Results demonstrated that this setup beats the classical theoretical limit for similar systems, achieving a classification fidelity of 30% compared to 24% for classical models.

  • The quantum classifier uses a spatial light modulator to encode image patterns and Hermite-Gaussian modes for classification. In contrast, the classical classifier uses traditional pixel detection methods, which limits performance due to the inability to capture correlations between pixels.

  • Since the neural network is implemented optically, the system is energy-efficient, and decision-making occurs upon detecting the first photon. According to the article, this means the system consumes less than 10⁻²⁴ joules per calculation, notably lower than current state-of-the-art systems.

  • The researchers argue that this may be the first experimental demonstration of quantum advantage in AI, establishing quantum superposition as a clear performance booster in machine learning tasks as well as presenting a method to create highly energy-efficient quantum optical AI systems.
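
For intuition only, here is a toy NumPy comparison of the two probing strategies described above (an assumption-heavy illustration with made-up 4x4 patterns, not the experiment's optics, modes, or data): a single photon in superposition over all pixel modes is projected onto class templates, versus a photon detected at one pixel at a time.

```python
import numpy as np

rng = np.random.default_rng(0)

def normalize(v):
    return v / np.linalg.norm(v)

# Two toy 4x4 image classes (hypothetical patterns), flattened to length-16 vectors.
template_0 = normalize(np.kron([1, 1, 0, 0], [1, 1, 1, 1]).astype(float))  # top bar
template_1 = normalize(np.kron([0, 0, 1, 1], [1, 1, 1, 1]).astype(float))  # bottom bar

def superposition_probe(image, label):
    """One photon spread over all pixel modes, projected onto the two class templates;
    classify by whichever template-mode detector clicks (conditioned on a click)."""
    p0 = (template_0 @ image) ** 2
    p1 = (template_1 @ image) ** 2
    return (p0 if label == 0 else p1) / (p0 + p1)

def single_pixel_probe(image, label):
    """One photon detected at a single pixel drawn from the image intensity;
    classify by which template is brighter at the detected pixel."""
    intensity = image ** 2                      # per-pixel detection probabilities
    guesses = np.where(template_0 ** 2 >= template_1 ** 2, 0, 1)
    return intensity @ (guesses == label).astype(float)

# Average correct-classification probability over noisy samples of both classes.
acc_q = acc_c = 0.0
n = 2000
for _ in range(n):
    label = int(rng.integers(2))
    clean = template_0 if label == 0 else template_1
    image = normalize(clean + 0.6 * rng.standard_normal(16))
    acc_q += superposition_probe(image, label) / n
    acc_c += single_pixel_probe(image, label) / n

print(f"superposition probe: {acc_q:.2f}, single-pixel probe: {acc_c:.2f}")
```

The superposition probe wins in this toy because the template projection uses the whole spatial pattern at once, while the single-pixel detector only ever sees local intensity, mirroring the correlation argument above.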

QUICK BYTE: Researchers from the University of Chicago, Mirion Technologies, and others present the first quantum simulation of conical intersections in the biomolecule cytosine using superconducting quantum computers, relevant for understanding DNA photostability.

DETAILS

  • The study explores how quantum computers can simulate conical intersections (CIs), geometries where electronic energy surfaces become degenerate, which impact processes like DNA photostability and repair. It focuses on cytosine, a nucleobase essential for both DNA and RNA, using the contracted quantum eigensolver (CQE) and variational quantum deflation (VQD) algorithms.

  • The quantum simulations used IBM’s 127-qubit superconducting quantum computer and noise-free simulators to compare the algorithms. Both CQE and VQD accurately computed near-degenerate ground and excited states, but CQE performed better due to its physics-informed design, making it more efficient in handling CIs (a minimal VQD sketch follows this list).

  • The results showed that CQE is highly effective in capturing the complex electronic structure of biomolecules, with applications extending to simulating nonadiabatic dynamics, which are relevant for understanding the photochemical behavior of biological molecules. This may also lead to a better understanding of DNA's behavior under UV radiation and its implications for biological processes like mutation and repair.
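
As a rough illustration of the variational quantum deflation idea mentioned above, here is a minimal NumPy/SciPy sketch on a random 4x4 "Hamiltonian" (all quantities are toy assumptions; the study's cytosine simulations, ansätze, and the CQE algorithm are far more involved): the ground state is found variationally, then an overlap penalty deflates it away so a second optimization lands on an excited state.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
dim = 4                                   # toy 2-qubit Hilbert space
A = rng.standard_normal((dim, dim))
H = (A + A.T) / 2                         # random real symmetric "Hamiltonian"

def ansatz(theta):
    """Toy real state-vector ansatz: hyperspherical angles -> normalized 4-vector."""
    s0, s1 = np.sin(theta[0]), np.sin(theta[1])
    return np.array([np.cos(theta[0]),
                     s0 * np.cos(theta[1]),
                     s0 * s1 * np.cos(theta[2]),
                     s0 * s1 * np.sin(theta[2])])

def energy(theta):
    psi = ansatz(theta)
    return psi @ H @ psi

# Step 1: variational ground state (a single local optimization; real runs would
# use better initialization and restarts).
res0 = minimize(energy, x0=rng.uniform(0, np.pi, 3), method="Nelder-Mead")
psi0 = ansatz(res0.x)

# Step 2: VQD cost = energy + beta * |overlap with the ground state|^2,
# pushing the optimizer toward the first excited state.
beta = 10.0
def vqd_cost(theta):
    psi = ansatz(theta)
    return psi @ H @ psi + beta * (psi0 @ psi) ** 2

res1 = minimize(vqd_cost, x0=rng.uniform(0, np.pi, 3), method="Nelder-Mead")

exact = np.linalg.eigvalsh(H)
print(f"VQD estimates:  E0 = {energy(res0.x):.3f}, E1 = {energy(res1.x):.3f}")
print(f"exact spectrum: E0 = {exact[0]:.3f}, E1 = {exact[1]:.3f}")
```

Near a conical intersection the lowest states are nearly degenerate, which is exactly the regime where resolving both ground and excited states, as VQD and CQE aim to do, matters most.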

Meet your own personal AI Agent, for everything…Proxy

Imagine if you had a digital clone to do your tasks for you. Well, meet Proxy…

Last week, Convergence, the London-based AI start-up, revealed Proxy to the world, the first general AI Agent.

Users are asking things like “Book my trip to Paris and find a restaurant suitable for an interview” or “Order a grocery delivery for me with a custom weekly meal plan”.

You can train it how you choose, so every Proxy is different and personalised to how you teach it. The more you teach it, the more it learns about your personal workflows and begins to automate them.

The Massachusetts Green High Performance Computing Center, in partnership with QuEra Computing, plans to build a $16 million Quantum Computing Complex in Holyoke, with $5 million in state funding and $11 million from QuEra. The center will use laser-controlled neutral-atom quantum technology, exciting atoms into enlarged Rydberg states to perform computations faster than traditional processors, with applications in cryptography, drug discovery, and materials science. This facility will be accessible to academic researchers through the New England Research Cloud and will use hydropower to maintain energy efficiency.

Researchers at the University of Sherbrooke have developed a theoretical framework explaining why qubit readouts in superconducting quantum computers, specifically with transmon qubits, are less accurate than predicted. Their models reveal that microwave signals used for qubit readout can excite qubits to high energy levels, causing them to lose their quantum state through ionization. This insight, supported by experimental data, resolves a decades-old mystery and could help improve the accuracy of superconducting quantum computers, reducing the need for error correction.

The U.S. Department of Energy's ARPA-E announced the Quantum Computing for Computational Chemistry (QC3) program, dedicated to developing quantum algorithms that address critical energy challenges like sustainable catalysts, superconductors, and battery chemistries. QC3 projects aim to push beyond the limits of classical computational chemistry, targeting a 100x performance improvement or a scalable advantage over current methods through optimized quantum solutions across software and hardware layers. These projects will target problems with potential high impact on energy efficiency and greenhouse gas reductions.

Skyrmions, stable magnetic structures with topologically protected spins, show promise for applications in microelectronics, spintronics, and quantum computing due to their resistance to noise and energy efficiency. Recent research, including 3D imaging by Lawrence Berkeley National Laboratory, highlights skyrmions' topological resilience against quantum noise, enabling potential use in more stable quantum information systems. This resilience, coupled with skyrmions' energy efficiency and scalability, may lead to skyrmion-based devices that could be used for data storage and processing in both classical and quantum technologies.

NIST has advanced 14 digital signature algorithms, including CROSS, FAEST, HAWK, and UOV, to the second round of its Additional Digital Signatures for PQC Standardization Process. This selection follows a year-long evaluation based on criteria detailed in NIST Internal Report 8528. Over the next 12-18 months, the second-round candidates may update their specifications, with results expected at the 6th NIST PQC Standardization Conference scheduled for September 2025.

LISTEN

…to the sound of “Ignacio” by Moth Quantum Research Scientist Alex Alani, used as the background music in the release announcement of the open-source Python Quantum Audio Package


WATCH

At the Open Compute Project Global Summit, Jin-Sung Kim presents on AI for quantum computing and NVIDIA’s quantum computing strategy:

oh to ponder the many biological and physical mysteries quantum might yet solve 📸: Midjourney