7 results for Abbott, A.A.

On Demons and Oracles
Abbott, A.A.; Calude, C.S.; Svozil, K. (2011)
Report
The University of Auckland Library
This paper presents a personal, subjective overview of the Church–Turing Thesis and some attempts to trespass the Turing barrier.
A Quantum Random Number Generator Certified by Value Indefiniteness
Abbott, A.A.; Calude, C.S.; Svozil, K. (2010)
Report
The University of Auckland Library
In this paper we propose a quantum random number generator (QRNG) which utilises an entangled photon pair in a Bell singlet state, and is certified explicitly by value indefiniteness. While "true randomness" is a mathematical impossibility, the certification by value indefiniteness ensures the quantum random bits are incomputable in the strongest sense. This is the first QRNG setup in which a physical principle (Kochen–Specker value indefiniteness) guarantees that no single quantum bit produced can be classically computed (reproduced and validated), the mathematical form of bitwise physical unpredictability. The effects of various experimental imperfections are discussed in detail, particularly those related to detector efficiencies, context alignment and temporal correlations between bits. The analysis is to a large extent relevant for the construction of any QRNG based on beam splitters. By measuring the two entangled photons in maximally misaligned contexts and utilising the fact that two bitstrings rather than one are obtained, more efficient and robust unbiasing techniques can be applied. A robust and efficient procedure based on XORing the bitstrings together, essentially using one as a one-time pad for the other, is proposed to extract random bits in the presence of experimental imperfections, as well as a more efficient modification of the von Neumann procedure for the same task. Some open problems are also discussed.
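The XOR-based extraction described in the abstract can be sketched as follows; this is a minimal illustration of the idea (function and variable names are ours, not the paper's implementation):

```python
def xor_extract(bits_a, bits_b):
    """XOR two bitstrings together: if either string is unpredictable,
    the output inherits that unpredictability, much as a one-time pad
    hides a message."""
    if len(bits_a) != len(bits_b):
        raise ValueError("bitstrings must have equal length")
    return [a ^ b for a, b in zip(bits_a, bits_b)]

# Two (possibly biased) strings from the two measurement contexts:
print(xor_extract([1, 1, 0, 1], [0, 1, 1, 1]))  # [1, 0, 1, 0]
```

Unlike von Neumann unbiasing, this produces one output bit per input pair with no discards, which is why the paper can call it more efficient.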
The Deutsch–Jozsa Problem: Dequantization and Entanglement
Abbott, A.A. (2009)
Report
The University of Auckland Library
The Deutsch–Jozsa problem is one of the most basic ways to demonstrate the power of quantum computation. Consider a Boolean function f : {0, 1}^n → {0, 1} and suppose we have a black-box to compute f. The Deutsch–Jozsa problem is to determine whether f is constant (i.e. f(x) = const, ∀x ∈ {0, 1}^n) or balanced (i.e. f(x) = 0 for exactly half the possible input strings x ∈ {0, 1}^n) using as few calls to the black-box computing f as possible, assuming f is guaranteed to be constant or balanced. Classically it appears that this requires at least 2^(n−1) + 1 black-box calls in the worst case, but the well-known quantum solution solves the problem with probability one in exactly one black-box call. It has been found that in some cases the algorithm can be dequantised into an equivalent classical, deterministic solution. We explore the ability to extend this dequantisation to further cases, and examine in more detail when dequantisation is possible, both with respect to the Deutsch–Jozsa problem and in more general black-box algorithms.
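The classical worst-case bound quoted in the abstract is easy to make concrete; the sketch below (names are ours, for illustration only) queries 2^(n−1) + 1 inputs, which under the constant-or-balanced promise always suffices:

```python
def dj_classical(f, n):
    """Classical deterministic Deutsch-Jozsa test: query 2**(n-1) + 1
    inputs. Under the promise that f is constant or balanced, if all
    answers agree f must be constant (a balanced f could not return
    the same value on more than half the inputs); any disagreement
    proves f is balanced."""
    first = f(0)
    for x in range(1, 2 ** (n - 1) + 1):
        if f(x) != first:
            return "balanced"
    return "constant"

print(dj_classical(lambda x: 1, 3))      # "constant"
print(dj_classical(lambda x: x & 1, 3))  # "balanced" (parity of last bit)
```

The quantum algorithm settles the same question in a single black-box call, which is the gap the dequantisation results probe.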
A Nuclear Magnetic Resonance Implementation of a Classical Deutsch–Jozsa Algorithm
Abbott, A.A.; Bechmann, M.; Calude, C.S.; Sebald, A. (2011)
Report
The University of Auckland Library
Nuclear magnetic resonance (NMR) has been widely used as a demonstrative medium for showcasing the ability of quantum computations to outperform classical ones. A large number of such experiments have been implementations of the Deutsch–Jozsa algorithm. It is known, however, that in some cases the Deutsch–Jozsa problem can be solved classically using as many queries to the black-box as in the quantum solution. In this paper we describe experiments in which we take the contrasting approach of using NMR as a classical computing medium, treating the nuclear spin vectors classically and utilising an alternative embedding of bits into the physical medium. This allows us to determine the actual Boolean function computed by the black-box for the n = 1, 2 cases, as opposed to only its nature (balanced or constant) as conventional quantum algorithms do. Discussion of these experiments leads to some clarification of the complications surrounding the comparison of different quantum algorithms, particularly black-box-type algorithms.
Dequantisation of the Quantum Fourier Transform
Abbott, A.A. (2010)
Report
The University of Auckland Library
The quantum Fourier transform (QFT) plays an important role in many known quantum algorithms, such as Shor's algorithm for prime factorisation. In this paper we show that the QFT algorithm can, on a restricted set of input states, be dequantised into a classical algorithm which is both more efficient and simpler than the quantum algorithm. By working directly with the algorithm instead of the circuit, we develop a simple classical version of the quantum basis-state algorithm. We formulate conditions for a separable state to remain separable after the QFT is performed, and use these conditions to extend the dequantised algorithm to work on all such states without loss of efficiency. Our technique highlights the linearity of quantum mechanics as the fundamental feature accounting for the difference between quantum and dequantised algorithms, and shows that it is this linearity which makes the QFT such a useful tool in quantum computation.
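The basis-state case the abstract mentions rests on a standard fact: QFT|x⟩ factorises into a product of single-qubit states, each of the form (|0⟩ + e^{2πiθ_k}|1⟩)/√2, so only n phases need computing rather than 2^n amplitudes. A minimal sketch of that observation (our own names, not the paper's algorithm):

```python
def qft_basis_state_phases(x, n):
    """Phases theta_k of QFT|x> on n qubits, written as the product
    state prod_k (|0> + exp(2*pi*i*theta_k)|1>)/sqrt(2).
    By the standard product formula, theta_k is the fractional part
    of x / 2**k. This is O(n) work, versus the 2**n amplitudes of
    the full state vector."""
    return [(x / 2 ** k) % 1.0 for k in range(1, n + 1)]

# QFT|01> on 2 qubits: phases 1/2 and 1/4, i.e. the factors
# (|0> - |1>)/sqrt(2) and (|0> + i|1>)/sqrt(2).
print(qft_basis_state_phases(1, 2))  # [0.5, 0.25]
```

Extending this efficiency beyond basis states to all separable-in, separable-out inputs is the contribution the abstract describes.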
Von Neumann Normalisation of a Quantum Random Number Generator
Abbott, A.A.; Calude, C.S. (2010)
Report
The University of Auckland Library
In this paper we study von Neumann unbiasing normalisation for ideal and real quantum random number generators, operating on finite strings or infinite bit sequences. In the ideal cases one can obtain the desired unbiasing. This relies critically on the independence of the source, a notion we rigorously define for our model. In real cases, affected by imperfections in measurement and hardware, one cannot achieve true unbiasing, but, if the bias "drifts sufficiently slowly", the result can be arbitrarily close to unbiased. For infinite sequences, normalisation can either increase or decrease the (algorithmic) randomness of the generated sequences. A successful application of von Neumann normalisation, or indeed any unbiasing transformation, does exactly what it promises: unbiasing, which is only one among infinitely many symptoms of randomness; it will not produce "true" randomness.
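For reference, the classical von Neumann procedure the paper analyses works on non-overlapping pairs of bits; a minimal sketch (illustrative only, not the paper's formal model):

```python
def von_neumann(bits):
    """Von Neumann unbiasing: read bits in non-overlapping pairs,
    emit 0 for the pair 01, emit 1 for 10, and discard 00 and 11.
    For an independent source with constant bias p, the pairs 01
    and 10 each occur with probability p*(1-p), so every emitted
    bit is exactly unbiased."""
    out = []
    for a, b in zip(bits[::2], bits[1::2]):
        if a != b:
            out.append(a)
    return out

print(von_neumann([0, 1, 1, 1, 1, 0, 0, 0]))  # [0, 1]
```

The independence assumption in the comment is precisely the hypothesis the paper makes rigorous, and the point of the abstract is that unbiasedness of the output is all this procedure can guarantee.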
Understanding the Quantum Computational Speedup via Dequantisation
Abbott, A.A.; Calude, C.S. (2010)
Report
The University of Auckland Library
While it seems possible that quantum computers may allow for algorithms offering a computational speedup over classical algorithms for some problems, the issue is poorly understood. We explore this computational speedup by investigating the ability to dequantise quantum algorithms into classical simulations which are as efficient in both time and space as the original quantum algorithms. The process of dequantisation helps formulate conditions to determine whether a quantum algorithm provides a real speedup over classical algorithms. These conditions can be used to develop new quantum algorithms more effectively (by avoiding features that could allow the algorithm to be efficiently classically simulated) and to create new classical algorithms (by using features which have proved valuable for quantum algorithms). Results on many different methods of dequantisation are presented, as well as a general formal definition of dequantisation. Dequantisations employing higher-dimensional classical bits, as well as those using matrix simulations, put emphasis on entanglement in quantum algorithms; a key result is that any algorithm in which the entanglement is bounded is dequantisable. These methods are contrasted with the stabiliser-formalism dequantisations due to the Gottesman–Knill Theorem, as well as those which take advantage of the topology of the circuit for a quantum algorithm. The benefits and limits of the different methods are discussed, and the importance of utilising a range of techniques is emphasised. We further discuss some features of quantum algorithms which current dequantisation methods do not cover and highlight several important open questions in the area.