Computationally Efficient Learning of Quality Controlled Word Embeddings for Natural Language Processing...

by Mohammed M Alawad, Georgia Tourassi
Publication Type: Conference Paper
Book Title: 2019 IEEE Computer Society Annual Symposium on VLSI (ISVLSI 2019)
Pages: 134–139
Volume: 2019
Conference Name: IEEE Computer Society Annual Symposium on VLSI
Conference Location: Miami, Florida, United States of America
Conference Sponsor: IEEE

Deep learning (DL) has been used for many natural language processing (NLP) tasks because of its superior performance compared to traditional machine learning approaches. In DL models for NLP, words are represented using word embeddings, which capture both semantic and syntactic information in text. However, 90-95% of the DL trainable parameters are associated with the word embeddings, resulting in a large storage or memory footprint. Reducing the number of word embedding parameters is therefore critical, especially as vocabulary sizes grow. In this work, we propose a novel approximate word embedding approach for convolutional neural networks (CNNs) used for text classification tasks. The proposed approach significantly reduces the number of trainable model parameters without noticeably sacrificing accuracy. Unlike other techniques, our word embedding technique requires no modifications to the DL model architecture. We evaluate the performance of the proposed word embeddings on three classification tasks using two datasets, composed of Yelp and Amazon reviews. The results show that the proposed method can reduce the number of word embedding parameters by 98% and 99% for the Yelp and Amazon datasets, respectively, with no drop in accuracy.
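
The abstract does not spell out the approximation mechanism, but the scale of the reported reduction is easy to reproduce in principle. Below is a minimal sketch of one common parameter-reduction technique, the hashing trick, in which many vocabulary words share rows of a small embedding table; it illustrates how a roughly 98% reduction can be achieved while leaving the rest of the CNN untouched. The class name HashedEmbedding, the bucket count, and the simple modulo hash are illustrative assumptions, not the configuration or method reported in the paper.

import torch
import torch.nn as nn

class HashedEmbedding(nn.Module):
    """Maps a large vocabulary onto a small table of shared embedding rows.

    Illustrative sketch only; the bucket count and hash are assumptions.
    """
    def __init__(self, vocab_size: int, num_buckets: int, embed_dim: int):
        super().__init__()
        self.vocab_size = vocab_size      # kept for reference/validation
        self.num_buckets = num_buckets
        # Only num_buckets x embed_dim parameters are trained,
        # instead of vocab_size x embed_dim.
        self.table = nn.Embedding(num_buckets, embed_dim)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # Fold each token id into a bucket; collisions mean several
        # words share (i.e., approximate) the same embedding row.
        bucket_ids = token_ids % self.num_buckets
        return self.table(bucket_ids)

vocab_size, embed_dim = 100_000, 300
full_params = vocab_size * embed_dim                  # 30,000,000
hashed = HashedEmbedding(vocab_size, num_buckets=2_000, embed_dim=embed_dim)
hashed_params = sum(p.numel() for p in hashed.parameters())  # 600,000 (98% fewer)

tokens = torch.randint(0, vocab_size, (4, 128))       # (batch, sequence)
vectors = hashed(tokens)                              # (4, 128, 300)
print(full_params, hashed_params, vectors.shape)

Because the module produces the same (batch, sequence, embed_dim) output as a standard embedding layer, the downstream convolutional and dense layers need no changes, which is consistent with the abstract's claim that the approach requires no modifications to the DL model architecture.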