Friday, 12 July 2019

A multi-representational convolutional neural network architecture for text classification

Over the past decade or so, convolutional neural networks (CNNs) have proven highly effective across a variety of tasks, including natural language processing (NLP). NLP entails the use of computational techniques to analyze or synthesize language in both written and spoken form. Researchers have successfully applied CNNs to several NLP tasks, including semantic parsing, search query retrieval, and text classification.
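To make the text-classification use case concrete, below is a minimal sketch of a standard single-representation CNN text classifier in PyTorch, in the spirit of the convolutional approaches mentioned above. It is not the multi-representational architecture the article describes; all hyperparameters (vocabulary size, embedding dimension, filter widths, number of classes) are illustrative assumptions.

```python
# Minimal CNN text classifier sketch -- hyperparameters are hypothetical,
# not taken from the article.
import torch
import torch.nn as nn


class TextCNN(nn.Module):
    def __init__(self, vocab_size=10000, embed_dim=128, num_classes=2,
                 filter_sizes=(3, 4, 5), num_filters=100):
        super().__init__()
        # Token embeddings: a single learned representation of the input text.
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # One 1-D convolution per filter width, sliding over token positions.
        self.convs = nn.ModuleList(
            nn.Conv1d(embed_dim, num_filters, kernel_size=k) for k in filter_sizes
        )
        self.fc = nn.Linear(num_filters * len(filter_sizes), num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer tensor of token indices
        x = self.embedding(token_ids)      # (batch, seq_len, embed_dim)
        x = x.transpose(1, 2)              # (batch, embed_dim, seq_len)
        # Convolve, apply ReLU, then max-pool over time for each filter width.
        pooled = [torch.relu(conv(x)).max(dim=2).values for conv in self.convs]
        features = torch.cat(pooled, dim=1)  # (batch, num_filters * len(filter_sizes))
        return self.fc(features)             # unnormalised class scores


if __name__ == "__main__":
    model = TextCNN()
    dummy_batch = torch.randint(1, 10000, (4, 50))  # 4 sequences, 50 tokens each
    print(model(dummy_batch).shape)                 # torch.Size([4, 2])
```

A multi-representational variant would feed several such input representations (for example, different embeddings of the same text) through parallel convolutional branches and combine the resulting features before classification; the sketch above shows only the single-representation baseline.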
