Deep machine learning - A new frontier in artificial intelligence research

Research output: Contribution to journal › Article › peer-review

1090 Scopus citations

Abstract

Mimicking the efficiency and robustness with which the human brain represents information has been a core challenge in artificial intelligence research for decades. Humans are exposed to a myriad of sensory data every second of the day, yet are somehow able to capture critical aspects of these data in a concise form that allows for future use. Over 50 years ago, Richard Bellman, who introduced dynamic programming theory and pioneered the field of optimal control, asserted that the high dimensionality of data is a fundamental hurdle in many science and engineering applications. The main difficulty, particularly in pattern classification applications, is that learning complexity grows exponentially with a linear increase in the dimensionality of the data. He coined this phenomenon the curse of dimensionality [1]. The mainstream approach to overcoming the curse has been to pre-process the data so as to reduce its dimensionality to a level that can be effectively processed, for example by a classification engine. This dimensionality reduction scheme is often referred to as feature extraction. As a result, it can be argued that the intelligence behind many pattern recognition systems has shifted to the human-engineered feature extraction process, which at times can be challenging and highly application-dependent [2]. Moreover, if incomplete or erroneous features are extracted, the classification process is inherently limited in performance.
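To make the feature-extraction step concrete, here is a minimal sketch (illustrative only, not from the article) of the pre-processing pipeline the abstract describes. It assumes NumPy and scikit-learn are available and uses PCA, one common dimensionality reduction technique, to compress hypothetical 784-dimensional inputs (e.g., flattened 28x28 images) to 50 features before they would reach a classifier.

```python
# Illustrative sketch of dimensionality reduction as feature extraction.
# Assumes NumPy and scikit-learn; the data here is synthetic noise,
# standing in for real high-dimensional sensory input.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Hypothetical high-dimensional data: 1000 samples, 784 dimensions each.
X = rng.standard_normal((1000, 784))

# Reduce to 50 hand-chosen features before classification.
pca = PCA(n_components=50)
X_reduced = pca.fit_transform(X)

print(X.shape, "->", X_reduced.shape)  # (1000, 784) -> (1000, 50)
print("variance retained:", pca.explained_variance_ratio_.sum())
```

On random noise the retained variance is low, but on real sensory data far fewer components typically capture most of the variance. The choice of reduction technique and target dimensionality is exactly the human-engineered, application-dependent step the abstract argues deep learning seeks to replace.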

Original language: English
Article number: 12
Pages (from-to): 13-18
Number of pages: 6
Journal: IEEE Computational Intelligence Magazine
Volume: 5
Issue number: 4
DOIs
State: Published - Nov 2010
Externally published: Yes
