Publication Library
Quantum Computing Enhanced Sensing
Description: Quantum computing and quantum sensing represent two distinct frontiers of quantum information science. In this work, we harness quantum computing to solve a fundamental and practically important sensing problem: the detection of weak oscillating fields with unknown strength and frequency. We present a quantum computing enhanced sensing protocol that outperforms all existing approaches. Furthermore, we prove our approach is optimal by establishing the Grover-Heisenberg limit -- a fundamental lower bound on the minimum sensing time. The key idea is to robustly digitize the continuous, analog signal into a discrete operation, which is then integrated into a quantum algorithm. Our metrological gain originates from quantum computation, distinguishing our protocol from conventional sensing approaches. Indeed, we prove that broad classes of protocols based on quantum Fisher information, finite-lifetime quantum memory, or classical signal processing are strictly less powerful. Our protocol is compatible with multiple experimental platforms. We propose and analyze a proof-of-principle experiment using nitrogen-vacancy centers, where meaningful improvements are achievable using current technology. This work establishes quantum computation as a powerful new resource for advancing sensing capabilities.
Created At: 18 January 2025
Updated At: 18 January 2025
Modeling Entanglement-Based Quantum Key Distribution for the NASA Quantum Communications Analysis Suite
Description: One of the most practical and sought-after applications of quantum mechanics in the field of information science is the use of entanglement distribution to communicate quantum information effectively. Similar to the continued improvements of functional quantum computers over the past decade, advances in demonstrations of entanglement distribution over long distances may enable new applications in aeronautics and space communications. The existing NASA Quantum Communications Analysis Suite (NQCAS) software models such applications, but limited experimental data exists to verify the model's theoretical results. There is, however, a large body of experimental data in the relevant literature for entanglement-based quantum key distribution (QKD). This paper details a Monte Carlo-based QKD model that uses NQCAS input parameters to generate an estimated QKD link budget for verification of NQCAS. The model generates link budget statistics like key rates, error rates, and S values that can then be compared to the experimental values in the literature. Preliminary comparisons show many similarities between the simulated and experimental data, supporting the model's validity. A verified NQCAS model will inform experimental work conducted in Glenn Research Center's (GRC) NASA Quantum Metrology Laboratory (NQML), supporting the United States Quantum Initiative and potential NASA missions.
Created At: 18 January 2025
Updated At: 18 January 2025
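The Monte Carlo QKD model described above generates link-budget statistics such as sifted key rates and error rates from input parameters. A minimal illustrative sketch of that idea, in which detector efficiency and intrinsic error probability are purely hypothetical placeholders (not actual NQCAS inputs), might look like:

```python
import random

# Toy Monte Carlo sketch of an entanglement-based QKD link budget.
# All parameters below are illustrative placeholders, not NQCAS inputs.

def simulate_link(n_pairs, p_detect=0.1, p_error=0.03, seed=1):
    """Return (sifted_key_rate, qber) from a simple pair-by-pair model."""
    rng = random.Random(seed)
    sifted = errors = 0
    for _ in range(n_pairs):
        # Both detectors must fire to register a coincidence.
        if rng.random() > p_detect * p_detect:
            continue
        # Alice and Bob each pick a measurement basis at random;
        # only matching bases (probability 1/2) enter the sifted key.
        if rng.random() < 0.5:
            continue
        sifted += 1
        # An intrinsic error flips the bit with probability p_error.
        if rng.random() < p_error:
            errors += 1
    qber = errors / sifted if sifted else 0.0
    return sifted / n_pairs, qber
```

Statistics accumulated this way (key rate, quantum bit error rate) are the kind of quantities that can be compared against experimental values reported in the QKD literature.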
Quantum algorithm for the gradient of a logarithm-determinant
Description: The logarithm-determinant is a common quantity in many areas of physics and computer science. Derivatives of the logarithm-determinant compute physically relevant quantities in statistical physics models and quantum field theories, as well as the inverses of matrices. A multi-variable version of the quantum gradient algorithm is developed here to evaluate the derivative of the logarithm-determinant. From this, the inverse of a sparse-rank input operator may be determined efficiently. Measuring an expectation value of the quantum state, rather than all N² elements of the input operator, can be accomplished in O(kσ) time in the idealized case for k relevant eigenvectors of the input matrix. A factor σ = 1/ε³ or −ε⁻² log₂ ε depends on the version of the quantum Fourier transform used for a precision ε. Practical implementation of the required operator will likely need a log₂ N overhead, giving an overall complexity of O(kσ log₂ N). The method applies widely and converges super-linearly in k when the condition number is high. For non-sparse-rank inputs, the algorithm can be evaluated provided that an equal superposition of eigenstates is supplied. The output is given in O(1) queries of the oracle, which is given explicitly here and relies only on time-evolution operators that can be implemented with arbitrarily small error. The algorithm is envisioned for fully error-corrected quantum computers but may be implementable on near-term machines. We discuss how this algorithm can be used for kernel-based quantum machine learning.
Created At: 18 January 2025
Updated At: 18 January 2025
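The gradient targeted by the algorithm above is governed by Jacobi's formula, d/dt log det A(t) = tr(A(t)⁻¹ dA/dt), which ties the logarithm-determinant to the matrix inverse. A small classical sketch verifying this identity numerically (the matrix and perturbation direction are arbitrary examples, not taken from the paper):

```python
import numpy as np

# Classical illustration of the identity the quantum algorithm evaluates:
#   d/dt log det A(t) = tr(A(t)^{-1} dA/dt)
# The matrices below are arbitrary examples for demonstration only.

rng = np.random.default_rng(0)
B = rng.normal(size=(4, 4))
A0 = B @ B.T + 4 * np.eye(4)   # symmetric positive-definite base matrix
D = np.eye(4)                  # perturbation direction: A(t) = A0 + t*D

def logdet(t):
    # slogdet is numerically stabler than log(det(...))
    sign, ld = np.linalg.slogdet(A0 + t * D)
    return ld

# Analytic gradient at t = 0 via the trace identity
grad_analytic = np.trace(np.linalg.inv(A0) @ D)

# Central finite-difference check
h = 1e-6
grad_fd = (logdet(h) - logdet(-h)) / (2 * h)

print(grad_analytic, grad_fd)
```

The two values agree to numerical precision, which is the classical content behind using the gradient of the logarithm-determinant to access a matrix inverse.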
Quantum Models of Consciousness from a Quantum Information Science Perspective
Description: This perspective explores various quantum models of consciousness from the viewpoint of quantum information science, offering potential ideas and insights. The models under consideration can be categorized into three distinct groups based on the level at which quantum mechanics might operate within the brain: those suggesting that consciousness arises from electron delocalization within microtubules inside neurons, those proposing it emerges from the electromagnetic field surrounding the entire neural network, and those positing it originates from the interactions between individual neurons governed by neurotransmitter molecules. Our focus is particularly on the Posner model of cognition, for which we provide preliminary calculations on the preservation of entanglement of phosphate molecules within the geometric structure of Posner clusters. These findings provide valuable insights into how quantum information theory can enhance our understanding of brain functions.
Created At: 18 January 2025
Updated At: 18 January 2025
Mission: Impossible Language Models
Description: Chomsky and others have very directly claimed that large language models (LLMs) are equally capable of learning languages that are possible and impossible for humans to learn. However, there is very little published experimental evidence to support such a claim. Here, we develop a set of synthetic impossible languages of differing complexity, each designed by systematically altering English data with unnatural word orders and grammar rules. These languages lie on an impossibility continuum: at one end are languages that are inherently impossible, such as random and irreversible shuffles of English words; at the other are languages that may not be intuitively impossible but are often considered so in linguistics, particularly those with rules based on counting word positions. We report on a wide range of evaluations to assess the capacity of GPT-2 small models to learn these uncontroversially impossible languages, and crucially, we perform these assessments at various stages throughout training to compare the learning process for each language. Our core finding is that GPT-2 struggles to learn impossible languages when compared to English as a control, challenging the core claim. More importantly, we hope our approach opens up a productive line of inquiry in which different LLM architectures are tested on a variety of impossible languages in an effort to learn more about how LLMs can be used as tools for these cognitive and typological investigations.
Created At: 18 January 2025
Updated At: 18 January 2025
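The impossibility continuum described above ranges from random, irreversible shuffles of English words to systematic but unnatural reorderings. A minimal sketch of two such perturbations (these functions are illustrative stand-ins, not the paper's exact transformations):

```python
import random

# Two toy "impossible language" perturbations of English word order,
# in the spirit of the continuum described above. Both functions are
# illustrative examples, not the paper's actual transformations.

def shuffle_words(sentence, seed):
    """Seeded random shuffle: destroys word order information entirely."""
    words = sentence.split()
    random.Random(seed).shuffle(words)
    return " ".join(words)

def reverse_words(sentence):
    """Full reversal: a systematic but humanly unnatural word-order rule."""
    return " ".join(reversed(sentence.split()))

sent = "the cat sat on the mat"
print(shuffle_words(sent, seed=0))
print(reverse_words(sent))
```

Corpora transformed this way can then serve as training data for small language models, allowing learning curves on the perturbed languages to be compared against an English control.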