Neural nets achieve text understanding from scratch, without prior knowledge of words and phrases

2.8.15
  • ConvNets do not need any knowledge of the syntactic or semantic structure of a language to achieve good benchmark results in text understanding. This finding contrasts with various previous approaches, where a dictionary of words is a necessary starting point and structured parsing is usually hard-wired into the model.
  • ConvNets are applied to various large-scale datasets, spanning ontology classification, sentiment analysis, and text categorization. The research shows that temporal ConvNets can achieve astonishing performance without any knowledge of words, phrases, sentences, or any other syntactic or semantic structure of a human language. Evidence shows that the models work for both English and Chinese.
  • It is also worth noting that natural language is, in essence, a time series in disguise. One natural extension of the approach is therefore to general time-series data, where a hierarchical feature-extraction mechanism could bring improvements over the recurrent and regression models widely used today.
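The "from scratch" idea in the first two bullets can be made concrete with a small sketch: text is treated as a raw character sequence, quantized into one-hot vectors over a fixed alphabet, and then passed through a temporal (1-D) convolution with max-pooling. Everything below — the alphabet, the frame length, the kernel width — is an illustrative assumption, not the exact configuration used in the research.

```python
# Hypothetical sketch of character-level input encoding plus one temporal
# convolution layer, written in pure Python for clarity. The alphabet,
# sequence length, and kernel width are illustrative choices.

ALPHABET = "abcdefghijklmnopqrstuvwxyz0123456789 "
CHAR_INDEX = {c: i for i, c in enumerate(ALPHABET)}

def quantize(text, length=16):
    """One-hot encode the first `length` characters.

    Each time step is a vector of len(ALPHABET) floats; characters
    outside the alphabet are left as all-zero vectors, so the model
    never needs a word dictionary or parser.
    """
    frame = [[0.0] * len(ALPHABET) for _ in range(length)]
    for t, ch in enumerate(text.lower()[:length]):
        i = CHAR_INDEX.get(ch)
        if i is not None:
            frame[t][i] = 1.0
    return frame

def temporal_conv(frame, kernel, width=3):
    """Slide a kernel over the time axis (a 1-D convolution).

    `kernel` has width * len(ALPHABET) weights and produces one scalar
    per window position.
    """
    outputs = []
    for t in range(len(frame) - width + 1):
        window = [v for step in frame[t:t + width] for v in step]
        outputs.append(sum(w * v for w, v in zip(kernel, window)))
    return outputs

def max_pool(values):
    """Keep the strongest activation across time positions."""
    return max(values)

# Usage: encode raw characters, convolve, pool down to one feature.
frame = quantize("hello world")
kernel = [0.1] * (3 * len(ALPHABET))
feature = max_pool(temporal_conv(frame, kernel))
```

A real model stacks many such convolution/pooling layers with learned kernels and ends in a classifier, but the key point survives even in this toy version: the input pipeline sees only characters, never words or parse trees.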
