Faster quantum state decomposition with Tucker tensor approximation

Cited by: 0
Authors
Protasov Stanislav
Lisnichenko Marina
Affiliations
[1] Innopolis University, Machine Learning and Knowledge Representation Lab
Keywords
Quantum state preparation; Tensor decomposition; NISQ; Tucker decomposition
DOI: not available
Abstract
Researchers have put considerable effort into reducing the gap between the capabilities of current quantum processing units (QPUs) and their potential supremacy. One approach is to keep supplementary computations on the CPU and use the QPU only for the core of the problem. In this work, we address the complexity of quantum algorithms for arbitrary quantum state initialization. With existing precise initialization algorithms, QPUs do not outperform classical machines. Hence, many studies propose approximate but robust quantum state initialization. Cutting a quantum state into a product of (almost) independent partitions with the help of the CPU reduces the number of two-qubit gates and, correspondingly, minimizes the loss of state fidelity in the quantum part of the algorithm. To find the least entangled qubits, current methods compute a singular value decomposition (SVD) for each qubit separately on the CPU. In this paper, we target the CPU time and memory bottlenecks of this step. We consider Tucker tensor decomposition as an alternative to CPU-based SVD for detecting a single low-entangled qubit, without loss of solution quality. An iterative implementation of Tucker tensor decomposition replaces the explicit applications of SVD proposed in Araujo et al. (2021). This improvement yields both a theoretical and a practical reduction in time complexity for the circuit-preparation part of quantum algorithms working with vector data. We propose two implementations of our method; both outperform the SVD in time and memory for systems of at least ten qubits. For a system of 15 qubits, we achieve an order-of-magnitude faster implementation and two orders of magnitude less memory usage.
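To make the setting concrete, here is a minimal, illustrative Python/NumPy sketch. It is not the authors' code: the function names are hypothetical, and the Gram-matrix shortcut below is only a HOSVD-flavoured stand-in for the paper's iterative Tucker decomposition. The sketch shows the baseline the paper optimizes (detecting the least entangled qubit by taking an SVD of each single-qubit unfolding of the state tensor) and a Tucker/HOSVD-style variant that reads the same singular values off a 2x2 Gram matrix per mode, avoiding an explicit SVD per qubit.

```python
import numpy as np

def least_entangled_qubit_svd(psi, n):
    """Baseline (SVD-based scheme): for every qubit, unfold the n-qubit
    state tensor along that qubit's mode into a 2 x 2^(n-1) matrix and
    take its SVD; the qubit with the smallest second singular value is
    the least entangled with the rest."""
    T = np.asarray(psi).reshape((2,) * n)
    scores = []
    for q in range(n):
        M = np.moveaxis(T, q, 0).reshape(2, -1)   # mode-q unfolding
        s = np.linalg.svd(M, compute_uv=False)    # descending order
        scores.append(s[1])                       # ~0 => qubit q is (almost) separable
    return int(np.argmin(scores)), scores

def least_entangled_qubit_gram(psi, n):
    """HOSVD-flavoured shortcut (an assumption, not the paper's exact
    iterative algorithm): the singular values of each mode-q unfolding
    M are the square roots of the eigenvalues of the 2x2 Hermitian
    Gram matrix M @ M^H, so no explicit SVD is needed."""
    T = np.asarray(psi).reshape((2,) * n)
    scores = []
    for q in range(n):
        M = np.moveaxis(T, q, 0).reshape(2, -1)
        G = M @ M.conj().T                        # 2x2 Gram matrix
        lam = np.linalg.eigvalsh(G)               # ascending eigenvalues
        scores.append(np.sqrt(max(lam[0], 0.0)))  # smaller singular value
    return int(np.argmin(scores)), scores

# Example: a 3-qubit state in which qubit 2 is exactly separable.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)  # entangled pair (qubits 0, 1)
psi = np.kron(bell, np.array([1.0, 0.0]))            # qubit 2 fixed in |0>
assert least_entangled_qubit_svd(psi, 3)[0] == 2
assert least_entangled_qubit_gram(psi, 3)[0] == 2
```

Both routines in the sketch still visit every mode independently; per the abstract, the paper's iterative Tucker implementation replaces the explicit per-qubit SVDs entirely, which is where the reported time and memory savings for systems of ten or more qubits come from.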