Quantum annealing for real-world machine learning applications

Rajdeep Kumar Nath, Himanshu Thapliyal, Travis S. Humble

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

Abstract

Optimizing the training of a machine learning pipeline is important for reducing training costs and improving model performance. One such optimization strategy is quantum annealing, an emerging computing paradigm that has shown potential for optimizing the training of machine learning models. A physical quantum annealer has been realized by D-Wave Systems and is available to the research community for experiments. Recent experiments on a variety of machine learning applications have shown promising results, especially under conditions where the performance of classical machine learning techniques is limited, such as scarce training data and high-dimensional features. This chapter explores the application of D-Wave's quantum annealer to optimizing machine learning pipelines for real-world classification problems. We review the application domains in which a physical quantum annealer has been used to train machine learning classifiers. We discuss and analyze experiments performed on the D-Wave quantum annealer for applications such as image recognition, remote sensing imagery, security, computational biology, biomedical sciences, and physics. Finally, we discuss the possible advantages of quantum annealing and the problems for which it is likely to be advantageous over classical computation.

Original language: English
Title of host publication: Quantum Computing
Subtitle of host publication: Circuits, Systems, Automation and Applications
Publisher: Springer International Publishing
Pages: 157-180
Number of pages: 24
ISBN (Electronic): 9783031379666
ISBN (Print): 9783031379659
State: Published - Nov 24 2023

Keywords

  • Classification
  • Machine learning
  • Optimization
  • Quantum annealing
  • Quantum computing
