Class Representative Learning for Zero-shot Learning Using Purely Visual Data

Research output: Contribution to journal › Article › peer-review


Abstract

Building robust classifiers with high precision is an important goal. In practice, this is challenging because real-world data are often noisy, sparse, or drawn from heterogeneous sources. A considerable gap therefore exists between a model built on training (seen) data and its performance on testing (unseen) data in applications. Recent works, including zero-shot learning (ZSL) and generalized zero-shot learning (G-ZSL), have attempted to overcome this gap through transfer learning. However, most of these works require building a model from visual input together with associated data such as semantics, attributes, and textual information. Furthermore, these models are built with all of the training data at once; they therefore apply to generic contexts but not to the specific settings that real-world applications eventually require. In this paper, we propose a novel model named class representative learning (CRL), a class-based classifier designed with the following unique contributions to machine learning: (1) a uniquely designed latent feature vector, the class representative (CR), which represents a class in an abstract embedding space using only features extracted by a deep neural network from input images; (2) parallel ZSL algorithms built on class representative learning; and (3) a novel projection-based inference method that uses the vector space model to reconcile the dominant differences between seen and unseen classes. This study demonstrates the benefit of using the class-based approach with CRs for ZSL and G-ZSL on eight benchmark datasets. Extensive experimental results suggest that our proposed CRL model significantly outperforms state-of-the-art methods in ZSL/G-ZSL-based image classification.
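The abstract describes class representatives built purely from deep visual features and a projection-based inference step in a vector space model. A minimal sketch of that idea, under the assumption that each class representative is the mean of its class's feature vectors and that inference projects a test feature onto each CR via cosine similarity (both are illustrative assumptions, not the paper's exact formulation):

```python
import numpy as np

def build_class_representatives(features, labels):
    """Average the deep feature vectors of each seen class into one CR
    (assumed aggregation; the paper's exact construction may differ)."""
    return {c: features[labels == c].mean(axis=0) for c in np.unique(labels)}

def classify(x, reps):
    """Projection-based inference sketch: assign x to the class whose CR
    has the highest cosine similarity with x."""
    best, best_sim = None, -np.inf
    for c, r in reps.items():
        sim = np.dot(x, r) / (np.linalg.norm(x) * np.linalg.norm(r))
        if sim > best_sim:
            best, best_sim = c, sim
    return best

# Toy 2-D example with two seen classes
feats = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
labs = np.array([0, 0, 1, 1])
reps = build_class_representatives(feats, labs)
print(classify(np.array([0.8, 0.2]), reps))  # → 0
```

Because the CRs are computed per class rather than from the full training set jointly, new classes can be added or removed without retraining, which is the property the class-based design emphasizes.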

Original language: English
Article number: 313
Journal: SN Computer Science
Volume: 2
Issue number: 4
DOIs
State: Published - Jul 2021

Funding

This work was partially supported by NSF CNS #1747751. The authors would like to thank Dr. Ye Wang, Associate Professor, UMKC, for constructive criticism of the manuscript.

Keywords

  • Image classification
  • Transfer learning
  • Zero-shot learning

