WORD VECTOR USING R-ADAM OPTIMIZATION FUNCTION

PALLAVI SAXENA, HARSHIT NAGAR, D. MALATHI

Abstract:

There has long been strong interest in linguistics in devising representations for the vocabulary of a language. The task is challenging because of its high computational requirements and the curse of dimensionality. We use the word-to-vector (Word2Vec) approach to process text, which is best described as a two-layer neural network: a text corpus is provided as input, and a set of vectors representing word features is obtained as output. Because it is not a deep neural network, it can be trained quickly compared with other neural architectures, and it embeds words in the numeric form that neural networks can consume. The Continuous Bag of Words (CBOW) and Skip-gram models for word vector representation have shown great success in capturing the semantic and syntactic relationships between the words of a given language. However, there is still scope for improvement in vector space representation. We propose applying the recently introduced Rectified Adam (RAdam) optimization function in place of the RMSprop optimization function. The function shows promising outcomes in training time and accuracy on large datasets.
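To make the described setup concrete, the sketch below shows a minimal CBOW-style Word2Vec model of the kind the abstract outlines, trained with Rectified Adam rather than RMSprop. The toy corpus, window size, embedding dimension, learning rate, and epoch count are illustrative assumptions, not values from the paper; `torch.optim.RAdam` is used as one readily available RAdam implementation, not necessarily the one the authors employed.

```python
# Minimal CBOW word2vec sketch trained with Rectified Adam (assumed setup).
import torch
import torch.nn as nn

# Toy corpus (assumption for illustration only).
corpus = "the quick brown fox jumps over the lazy dog".split()
vocab = sorted(set(corpus))
word_to_idx = {w: i for i, w in enumerate(vocab)}

# Build (context, target) pairs with a symmetric window of 2 words.
WINDOW = 2
pairs = []
for i in range(len(corpus)):
    context = [corpus[j] for j in range(i - WINDOW, i + WINDOW + 1)
               if j != i and 0 <= j < len(corpus)]
    if len(context) == 2 * WINDOW:  # keep only full windows
        pairs.append(([word_to_idx[w] for w in context], word_to_idx[corpus[i]]))

class CBOW(nn.Module):
    """Two-layer network: an embedding layer averaged over the context,
    followed by a linear projection onto the vocabulary."""
    def __init__(self, vocab_size, dim=16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.out = nn.Linear(dim, vocab_size)

    def forward(self, context):  # context: (batch, 2 * WINDOW)
        return self.out(self.embed(context).mean(dim=1))

model = CBOW(len(vocab))
# Rectified Adam in place of RMSprop, as the abstract proposes.
optimizer = torch.optim.RAdam(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

contexts = torch.tensor([c for c, _ in pairs])
targets = torch.tensor([t for _, t in pairs])
for epoch in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(contexts), targets)
    loss.backward()
    optimizer.step()

# The learned word vectors are the rows of the embedding matrix.
vectors = model.embed.weight.detach()
```

Swapping the optimizer back to RMSprop here is a one-line change (`torch.optim.RMSprop`), which is what makes this kind of comparison between the two optimizers straightforward to run.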

Keywords:

Word2Vec, CBOW, Skip-gram, word embeddings, machine translation, text summarization, Rectified Adam optimization function

Paper Details

Month: June
Year: 2020
Volume: 24
Issue: 8
Pages: 14055-14061
