Supervised-learning guarantee for quantum AdaBoost
Times Cited: 0
Authors:
Wang, Yabo [1,2]; Wang, Xin [3]; Qi, Bo [1,2]; Dong, Daoyi [4]
Affiliations:
[1] Chinese Acad Sci, Acad Math & Syst Sci, Key Lab Syst & Control, Beijing 100190, Peoples R China
[2] Univ Chinese Acad Sci, Beijing 100049, Peoples R China
[3] Tsinghua Univ, Dept Automat, Beijing 100084, Peoples R China
[4] Australian Natl Univ, Sch Engn, Canberra, ACT 2601, Australia
Source:
Physical Review Applied
Funding:
Australian Research Council;
National Natural Science Foundation of China;
Keywords:
Adaptive boosting - Adversarial machine learning - Contrastive Learning - Federated learning - Quantum computers - Quantum electronics - Quantum optics - Semi-supervised learning - Variational techniques;
DOI:
10.1103/PhysRevApplied.22.054001
Chinese Library Classification:
O59 [Applied Physics];
Subject Classification:
Abstract:
In the noisy intermediate-scale quantum (NISQ) era, the capabilities of variational quantum algorithms are greatly constrained by the limited number of qubits and the shallow depth of quantum circuits. We may view these variational quantum algorithms as weak learners in supervised learning. In machine learning, ensemble methods are general approaches to combining weak learners into a strong one. In this paper, focusing on classification, we theoretically establish and numerically verify a learning guarantee for quantum adaptive boosting (AdaBoost). The supervised-learning risk bound describes how the prediction error of quantum AdaBoost on binary classification decreases as the number of boosting rounds and the sample size increase. We further empirically demonstrate the advantages of quantum AdaBoost on a 4-class classification task. Quantum AdaBoost not only outperforms several other ensemble methods, but in the presence of noise it can also surpass the ideally noiseless but unboosted primitive classifier after only a few boosting rounds. Our work indicates that in the current NISQ era, introducing appropriate ensemble methods is particularly valuable for improving the performance of quantum machine learning algorithms.
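The abstract describes boosting weak variational quantum classifiers with AdaBoost. The sketch below shows only the generic binary AdaBoost weight-update loop that such a scheme rests on; the decision-stump weak learner and all function names (train_weak_learner, adaboost, T) are illustrative stand-ins for the shallow variational quantum circuits considered in the paper, not the authors' implementation.

```python
import numpy as np

def train_weak_learner(X, y, w):
    """Weighted decision stump on one feature; a stand-in for a shallow
    variational quantum classifier trained on the reweighted sample."""
    n, d = X.shape
    best = None
    for j in range(d):
        for thr in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = sign * np.where(X[:, j] > thr, 1, -1)
                err = np.sum(w[pred != y])
                if best is None or err < best[0]:
                    best = (err, j, thr, sign)
    err, j, thr, sign = best
    return (lambda Z: sign * np.where(Z[:, j] > thr, 1, -1)), err

def adaboost(X, y, T=10):
    """Standard binary AdaBoost with labels y in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)               # uniform initial sample weights
    learners, alphas = [], []
    for _ in range(T):                    # T boosting rounds
        h, err = train_weak_learner(X, y, w)
        err = max(err, 1e-12)             # guard against division by zero
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(-alpha * y * h(X))    # up-weight misclassified samples
        w /= w.sum()
        learners.append(h)
        alphas.append(alpha)
    # Final strong classifier: sign of the weighted vote of weak learners
    return lambda Z: np.sign(sum(a * h(Z) for a, h in zip(alphas, learners)))
```

In the paper's setting, each call to the weak learner would train a noisy, shallow variational circuit; the weight update and weighted vote above are what the risk bound in the abstract analyzes as the number of rounds T and the sample size grow.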
Pages: 16