July 2022: Computing Electronic Correlation Energies using Linear Depth Quantum Circuits

Computing Electronic Correlation Energies using Linear Depth Quantum Circuits
Chong Hian Chee, Adrian M. Mak, Daniel Leykam, Dimitris G. Angelakis
arXiv:2207.03949

Efficient computation of molecular energies is an exciting application of quantum computers, but current noisy intermediate-scale quantum (NISQ) devices can only execute shallow circuits, limiting existing quantum algorithms to small molecules. Here we demonstrate a variational NISQ-friendly algorithm for computing electronic correlation energies perturbatively, trading deep circuits for a larger number of shallow circuits with depth linear in the number of qubits. We tested the algorithm on several small molecules, both with classical simulations including noise models and on cloud quantum processors, showing that it not only reproduces the equilibrium molecular energies but also captures the perturbative electronic correlation effects at longer bond distances. As the fidelities of quantum processors continue to improve, our algorithm will enable the study of larger molecules than existing approaches with higher-order polynomial circuit depth.
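The perturbative idea behind the correlation-energy calculation can be illustrated entirely classically. A minimal sketch (a generic second-order perturbation-theory toy on a 2x2 Hamiltonian, not the paper's quantum algorithm; all values here are illustrative):

```python
# Toy second-order perturbation theory: H = H0 + V, with H0 diagonal and a
# small off-diagonal coupling v. The second-order energy correction
#   E2 = sum_k |<0|V|k>|^2 / (E_0 - E_k)
# approximates the shift of the exact ground-state energy below E_0.
import math

e0_levels = [0.0, 2.0]   # eigenvalues of the unperturbed H0 (illustrative)
v = 0.1                  # off-diagonal coupling strength (the perturbation)

# Second-order correction to the ground-state energy (single excited state here)
e2 = v**2 / (e0_levels[0] - e0_levels[1])

# Exact ground-state eigenvalue of the 2x2 matrix [[0, v], [v, 2]]
gap = e0_levels[1] - e0_levels[0]
exact = (sum(e0_levels) - math.sqrt(gap**2 + 4 * v**2)) / 2

print(e2, exact)  # e2 = -0.005, close to the exact shift of about -0.004988
```

For a weak coupling the perturbative estimate tracks the exact answer closely; the appeal of doing this step on a quantum processor is evaluating the required matrix elements for molecular Hamiltonians that are classically intractable.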

June 2022: Topological data analysis and machine learning

Topological data analysis and machine learning
Daniel Leykam, Dimitris G. Angelakis
arXiv:2206.15075

Topological data analysis refers to approaches for systematically and reliably computing abstract "shapes" of complex data sets. There are various applications of topological data analysis in the life and data sciences, with growing interest among physicists. We present a concise yet (we hope) comprehensive review of applications of topological data analysis to physics and to machine learning problems in physics, including the detection of phase transitions. We finish with a preview of anticipated directions for future research.
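The simplest "shape" such methods compute can be sketched in a few lines: counting connected components (the 0th Betti number) of a point cloud as a distance scale grows. This toy uses union-find and tracks only components; real TDA libraries such as GUDHI or Ripser also track loops and voids:

```python
# Count connected components of a point cloud when points within `scale`
# of each other are linked: the 0th Betti number of the Vietoris-Rips
# complex at that scale, computed via union-find.
import math

def betti0(points, scale):
    """Number of connected components at the given distance scale."""
    parent = list(range(len(points)))
    def find(i):                      # find root with path compression
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            if math.dist(points[i], points[j]) <= scale:
                parent[find(i)] = find(j)   # union the two components
    return len({find(i) for i in range(len(points))})

# Two clusters: three nearby points and two nearby points, far apart.
cloud = [(0, 0), (0.1, 0), (0, 0.1), (5, 0), (5.1, 0)]
profile = [betti0(cloud, s) for s in (0.05, 0.2, 1.0, 6.0)]
print(profile)  # [5, 2, 2, 1]: the count of 2 persists over a wide range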

July 2021: Fock State-enhanced Expressivity of Quantum Machine Learning Models

Fock State-enhanced Expressivity of Quantum Machine Learning Models
Beng Yee Gan, Daniel Leykam, Dimitris G. Angelakis
EPJ Quantum Technology 9 (1), 16

The data-embedding process is one of the bottlenecks of quantum machine learning, potentially negating any quantum speedups. In light of this, more effective data-encoding strategies are necessary. We propose a photonic-based bosonic data-encoding scheme that embeds classical data points using fewer encoding layers, while circumventing the need for nonlinear optical components, by mapping the data points into the high-dimensional Fock space. The expressive power of the circuit can be controlled via the number of input photons. Our work sheds light on the unique advantages quantum photonics offers for the expressive power of quantum machine learning models. By leveraging the photon-number dependent expressive power, we propose three different noisy intermediate-scale quantum-compatible binary classification methods with different scalings of required resources, suitable for different supervised classification tasks.
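The photon-number dependent expressive power can be pictured with a toy Fourier model (illustrative only; the equal-superposition amplitudes below are my assumption, not the paper's construction): a phase shifter encoding x imprints e^{ikx} on a k-photon component, so interfering Fock components 0..n produces a measured signal whose Fourier spectrum extends up to frequency n.

```python
# Toy model of Fock-state enhanced expressivity: the data x enters through
# phases e^{i k x} on Fock components k = 0..n, so the measured probability
# is a real Fourier series in x whose highest frequency equals the photon
# number n.
import cmath, math

def model(x, n):
    """Measured probability for an equal superposition of Fock components 0..n."""
    amp = sum(cmath.exp(1j * k * x) for k in range(n + 1))
    return abs(amp) ** 2

def fourier_coef(n, m, samples=256):
    """Discrete Fourier coefficient of the model at integer frequency m."""
    return sum(model(2 * math.pi * j / samples, n)
               * cmath.exp(-1j * m * 2 * math.pi * j / samples)
               for j in range(samples)) / samples

# The highest nonzero frequency equals the photon number n.
for n in (1, 3):
    print(n, abs(fourier_coef(n, n)), abs(fourier_coef(n, n + 1)))
    # |coefficient at frequency n| is 1; at frequency n+1 it vanishes
```

Adding photons therefore enlarges the frequency content of the model without any nonlinear optics, which is the sense in which expressivity is controlled by the photon number.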