Optimizing Noncontextual Representations in GPTs
ORAL
Abstract
Generalized contextuality, as introduced by Spekkens [Phys. Rev. A 71, 052108 (2005)] and referred to here simply as contextuality, is a foundational nonclassical feature of quantum theory and is increasingly regarded as a vital resource for quantum computation and communication. Several approaches exist for detecting contextuality in generalized probabilistic theories (GPTs), notably Selby et al. [Phys. Rev. Lett. 132, 050202 (2024)], but it remained unknown how to systematically find the smallest noncontextual ontological model of a GPT when one exists. Here, an ontological model of a GPT is a model that reproduces the outcome statistics of the GPT, and its size is its number of ontic states. Following Shahandeh et al. [arXiv:2406.19382 (2024)], we represent the outcome statistics of a GPT as a matrix of outcome probabilities. From this matrix we show that the size of the smallest ontological model is given by its nonnegative rank, and the size of the smallest noncontextual model by its doubly restricted nonnegative rank (DRNNR). We extend the work of Vavasis [SIAM J. Optim., pp. 1364-1377 (2009)] to map the DRNNR problem to a geometric problem with a known algorithmic solution. We then give an explicit example of a GPT whose smallest contextual ontological model is smaller than any noncontextual model. This raises the question of whether generalized contextuality is the most natural notion of nonclassicality.
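The matrix statement above can be illustrated with a small numerical sketch (not taken from the paper): an ontological model with k ontic states corresponds to a nonnegative factorization D ≈ RS of the outcome-probability matrix D with inner dimension k, so the smallest ontological model is fixed by the nonnegative rank of D. The toy example below builds a matrix with nonnegative rank at most 3 and recovers a factorization by plain multiplicative-update NMF; the matrix shapes, the normalization of S, and the omission of the extra "doubly restricted" constraints are all simplifying assumptions made here for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy GPT probability matrix D = R @ S with k = 3 "ontic states":
# each column of S is a probability distribution over ontic states
# (a preparation), and R holds nonnegative response functions.
# Only nonnegativity is enforced here; the DRNNR's additional
# normalization constraints are omitted in this sketch.
k = 3
R = rng.random((6, k))          # 6 measurement outcomes
S = rng.random((k, 5))          # 5 preparations
S /= S.sum(axis=0)              # columns of S are distributions
D = R @ S                       # nonnegative rank of D is at most 3

def nmf(D, k, iters=5000, eps=1e-12):
    """Multiplicative-update NMF: find W, H >= 0 with D ~= W @ H."""
    m, n = D.shape
    W = rng.random((m, k)) + eps
    H = rng.random((k, n)) + eps
    for _ in range(iters):
        H *= (W.T @ D) / (W.T @ W @ H + eps)
        W *= (D @ H.T) / (W @ H @ H.T + eps)
    return W, H

W, H = nmf(D, k)
err = np.linalg.norm(D - W @ H) / np.linalg.norm(D)
print(f"relative reconstruction error with {k} ontic states: {err:.2e}")
```

Finding such a factorization of minimal inner dimension is NP-hard in general (this is the hardness result of Vavasis cited above), which is what motivates mapping the DRNNR problem to a geometric problem with a known algorithmic solution.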
–
Publication: We plan to submit this work under the title "Geometric Interpretation of Contextuality" to PRX Quantum.
Presenters
-
Theodoros Yianni
Royal Holloway, University of London
Authors
-
Theodoros Yianni
Royal Holloway, University of London
-
Farid Shahandeh
Royal Holloway, University of London