Context Prediction Using Neural Networks

Rohith Manda, Deeba K, S Selvakumar

Abstract:

Word prediction can be thought of as predicting a word given its context; it is similar to the predictive text found on many devices. The concept of word embeddings is well suited to this type of problem. Embeddings are an alternative form of word representation that preserves syntactic and semantic information, and they make it possible to extract or study the relationships between the words in a corpus. The objective can be viewed as (i) generation of quality embeddings, and (ii) using neural networks to realise a model capable of accurately predicting a word, given its context as input. Generating embeddings is a tedious task that requires a considerable amount of computation. Using embeddings in an environment supported by machine learning algorithms requires transforming them, along with the corpus, into a vector space, i.e., a numeric form of representation. Word prediction can be of many types, depending on the model. For example, if the objective is to predict the (n+1)th word given the previous n words, the model may consider only word n, words (n, n-1), or words (n, n-1, n-2, …) when predicting the next word; the first two are referred to as the bigram and trigram models respectively.
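The sketch below illustrates the trigram idea described in the abstract: an embedding layer converts context words into vectors, which a small neural network maps to scores over the vocabulary. It is a minimal illustration assuming PyTorch and a toy corpus; the paper's actual architecture, hyperparameters, and dataset are not specified here.

```python
# Minimal trigram-style next-word predictor with learned embeddings.
# Corpus, vocabulary, embedding size, and training settings are
# illustrative assumptions, not values taken from the paper.
import torch
import torch.nn as nn

corpus = "the quick brown fox jumps over the lazy dog".split()
vocab = sorted(set(corpus))
word_to_idx = {w: i for i, w in enumerate(vocab)}  # corpus mapped to numeric (vector-space) form

CONTEXT_SIZE = 2   # trigram model: predict word n+1 from words n and n-1
EMBED_DIM = 10     # assumed embedding dimension

# Build (context, target) training pairs from the corpus.
data = [([corpus[i], corpus[i + 1]], corpus[i + 2])
        for i in range(len(corpus) - 2)]

class TrigramModel(nn.Module):
    """Embed each context word, concatenate, and score the vocabulary."""
    def __init__(self, vocab_size, embed_dim, context_size):
        super().__init__()
        self.embeddings = nn.Embedding(vocab_size, embed_dim)
        self.linear = nn.Linear(context_size * embed_dim, vocab_size)

    def forward(self, context_idxs):
        embeds = self.embeddings(context_idxs).view(1, -1)  # concatenate context embeddings
        return self.linear(embeds)  # unnormalised scores over the vocabulary

model = TrigramModel(len(vocab), EMBED_DIM, CONTEXT_SIZE)
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

for epoch in range(100):
    for context, target in data:
        ctx = torch.tensor([word_to_idx[w] for w in context])
        tgt = torch.tensor([word_to_idx[target]])
        optimizer.zero_grad()
        loss = loss_fn(model(ctx), tgt)
        loss.backward()
        optimizer.step()

# Predict the word following "the quick"; on this toy corpus the model
# should converge to "brown".
ctx = torch.tensor([word_to_idx["the"], word_to_idx["quick"]])
print(vocab[model(ctx).argmax().item()])
```

Setting CONTEXT_SIZE to 1 would give the bigram variant mentioned above; larger values extend the same scheme to longer contexts.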

Keywords:

Prediction, Neural Networks.

Paper Details
Month: 3
Year: 2020
Volume: 24
Issue: 6
Pages: 4151-4159