TY - GEN
T1 - Computationally Efficient Learning of Quality Controlled Word Embeddings for Natural Language Processing
AU - Alawad, Mohammed
AU - Tourassi, Georgia
N1 - Publisher Copyright:
© 2019 IEEE.
PY - 2019/7
Y1 - 2019/7
N2 - Deep learning (DL) has been used for many natural language processing (NLP) tasks due to its superior performance compared to traditional machine learning approaches. In DL models for NLP, words are represented using word embeddings, which capture both semantic and syntactic information in text. However, 90-95% of the DL trainable parameters are associated with the word embeddings, resulting in a large storage or memory footprint. Therefore, reducing the number of word embedding parameters is critical, especially as vocabulary size grows. In this work, we propose a novel approximate word embedding approach for convolutional neural networks (CNNs) used for text classification tasks. The proposed approach significantly reduces the number of trainable model parameters without a noticeable sacrifice in accuracy. Unlike other techniques, our proposed word embedding technique does not require modifications to the DL model architecture. We evaluate the performance of the proposed word embeddings on three classification tasks using two datasets composed of Yelp and Amazon reviews. The results show that the proposed method can reduce the number of word embedding parameters by 98% and 99% for the Yelp and Amazon datasets, respectively, with no drop in accuracy.
AB - Deep learning (DL) has been used for many natural language processing (NLP) tasks due to its superior performance compared to traditional machine learning approaches. In DL models for NLP, words are represented using word embeddings, which capture both semantic and syntactic information in text. However, 90-95% of the DL trainable parameters are associated with the word embeddings, resulting in a large storage or memory footprint. Therefore, reducing the number of word embedding parameters is critical, especially as vocabulary size grows. In this work, we propose a novel approximate word embedding approach for convolutional neural networks (CNNs) used for text classification tasks. The proposed approach significantly reduces the number of trainable model parameters without a noticeable sacrifice in accuracy. Unlike other techniques, our proposed word embedding technique does not require modifications to the DL model architecture. We evaluate the performance of the proposed word embeddings on three classification tasks using two datasets composed of Yelp and Amazon reviews. The results show that the proposed method can reduce the number of word embedding parameters by 98% and 99% for the Yelp and Amazon datasets, respectively, with no drop in accuracy.
KW - Approximate computing
KW - convolutional neural networks
KW - natural language processing
KW - word embeddings
UR - http://www.scopus.com/inward/record.url?scp=85072958588&partnerID=8YFLogxK
U2 - 10.1109/ISVLSI.2019.00033
DO - 10.1109/ISVLSI.2019.00033
M3 - Conference contribution
AN - SCOPUS:85072958588
T3 - Proceedings of IEEE Computer Society Annual Symposium on VLSI, ISVLSI
SP - 134
EP - 139
BT - Proceedings - 2019 IEEE Computer Society Annual Symposium on VLSI, ISVLSI 2019
PB - IEEE Computer Society
T2 - 18th IEEE Computer Society Annual Symposium on VLSI, ISVLSI 2019
Y2 - 15 July 2019 through 17 July 2019
ER -