AI News, NIPS Proceedings
- On Wednesday, June 6, 2018
Part of: Advances in Neural Information Processing Systems 27 (NIPS 2014)
Recursive Neural Networks have recently obtained state-of-the-art performance on several natural language processing tasks.
We introduce global belief recursive neural networks (GB-RNNs) which are based on the idea of extending purely feedforward neural networks to include one feedbackward step during inference.
The feedbackward step improves F1 performance by 3% over the standard RNN on this sentiment task, obtains state-of-the-art performance on the SemEval 2013 challenge, and can accurately predict the sentiment of specific entities.
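The core idea in the abstract can be sketched in a few lines: compose a parse tree bottom-up as a standard recursive network does, then run one top-down "feedbackward" pass so each node can refine its vector using its parent's belief. This is a minimal illustrative sketch, not the paper's actual parameterization; the weight names, dimensions, and tree are all toy assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4                                            # toy hidden size
W_fwd = rng.standard_normal((d, 2 * d)) * 0.1    # composes two children into a parent
W_back = rng.standard_normal((d, 2 * d)) * 0.1   # mixes a node with its parent

def forward(left, right):
    # Standard recursive composition: parent = tanh(W [left; right])
    return np.tanh(W_fwd @ np.concatenate([left, right]))

def feedback(node, parent):
    # One feedbackward step: refine a node's vector using the parent's belief
    return np.tanh(W_back @ np.concatenate([node, parent]))

# Leaf vectors for a 3-word sentence with tree structure ((w1 w2) w3)
w1, w2, w3 = (rng.standard_normal(d) for _ in range(3))
p12 = forward(w1, w2)                  # bottom-up (feedforward) pass
root = forward(p12, w3)
p12_refined = feedback(p12, root)      # top-down (feedbackward) pass
w3_refined = feedback(w3, root)
```

After the top-down pass, every node's representation reflects global sentence context, which is what lets the model disambiguate the sentiment of individual entities.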
- On Saturday, September 21, 2019
Text Analytics - Ep. 25 (Deep Learning SIMPLIFIED)
Unstructured textual data is ubiquitous, but standard Natural Language Processing (NLP) techniques are often insufficient for properly analyzing it.
Lecture 2 | Word Vector Representations: word2vec
Lecture 2 continues the discussion on the concept of representing words as numeric vectors and popular approaches to designing word vectors. Key phrases: ...
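The premise of the lecture is that words become numeric vectors and similarity between words becomes geometry. A minimal sketch of that idea, using hypothetical toy embeddings rather than trained word2vec output:

```python
import numpy as np

# Hypothetical 3-dimensional word vectors (illustrative values, not trained)
embeddings = {
    "cat": np.array([0.9, 0.1, 0.0]),
    "dog": np.array([0.8, 0.2, 0.1]),
    "car": np.array([0.0, 0.1, 0.9]),
}

def cosine(u, v):
    # Similarity = cosine of the angle between two word vectors
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words end up closer in vector space
assert cosine(embeddings["cat"], embeddings["dog"]) > cosine(embeddings["cat"], embeddings["car"])
```

word2vec learns such vectors from raw text by predicting context words, so that words appearing in similar contexts receive similar vectors.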
Lecture 8: Recurrent Neural Networks and Language Models
Lecture 8 covers traditional language models, RNNs, and RNN language models. Also reviewed are important training problems and tricks, RNNs for other ...
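The RNN language model covered in the lecture can be summarized in one step of computation: a hidden state carries context forward, and a softmax over the vocabulary scores the next word. A toy sketch with assumed sizes and random weights:

```python
import numpy as np

rng = np.random.default_rng(1)
V, d = 5, 3                                   # toy vocabulary and hidden sizes
W_xh = rng.standard_normal((d, V)) * 0.1      # input-to-hidden
W_hh = rng.standard_normal((d, d)) * 0.1      # hidden-to-hidden (recurrence)
W_hy = rng.standard_normal((V, d)) * 0.1      # hidden-to-output

def rnn_step(h_prev, word_id):
    x = np.zeros(V)
    x[word_id] = 1.0                          # one-hot encoding of the input word
    h = np.tanh(W_xh @ x + W_hh @ h_prev)     # recurrent hidden-state update
    logits = W_hy @ h
    probs = np.exp(logits) / np.exp(logits).sum()  # softmax over next word
    return h, probs

h = np.zeros(d)
for w in [0, 3, 1]:                           # feed a toy word sequence
    h, probs = rnn_step(h, w)
```

The training problems the lecture mentions (vanishing/exploding gradients) arise from backpropagating through many applications of `W_hh`.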
Lecture 16: Dynamic Neural Networks for Question Answering
Lecture 16 addresses the question "Can all NLP tasks be seen as question answering problems?". Key phrases: Coreference Resolution, Dynamic Memory ...
Lecture 13: Convolutional Neural Networks
Lecture 13 provides a mini tutorial on Azure and GPUs followed by research highlight "Character-Aware Neural Language Models." Also covered are CNN ...
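The CNN-for-text idea the lecture covers can be sketched briefly: slide a filter over consecutive word vectors and max-pool the responses into a single feature. This is an illustrative toy, not the character-aware model from the research highlight; sizes and weights are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
d, n = 4, 6                                   # embedding size, sentence length
sentence = rng.standard_normal((n, d))        # one vector per word
filt = rng.standard_normal((2, d)) * 0.1      # convolution filter spanning 2 words

# Convolve: one response per window of 2 adjacent words
responses = [np.tanh((sentence[i:i + 2] * filt).sum()) for i in range(n - 1)]
feature = max(responses)                      # max-over-time pooling
```

In practice many filters of several widths are used, and the pooled features feed a classifier.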
How to Make a Chatbot - Intro to Deep Learning #12
Only a few days left to sign up for my Decentralized Applications course! Let's make a question answering chatbot using the bleeding ..
Predicting the Winning Team with Machine Learning
Can we predict the outcome of a football game given a dataset ..
Lecture 15: Coreference Resolution
Lecture 15 covers what coreference is via a working example. Also includes research highlight "Summarizing Source Code", an introduction to coreference ...
Lecture 9: Machine Translation and Advanced Recurrent LSTMs and GRUs
Lecture 9 recaps the most important concepts and equations covered so far followed by machine translation and fancy RNN models tackling MT. Key phrases: ...
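One of the "fancy" RNN variants the lecture covers is the GRU, which uses gates to decide how much of the past to keep. A minimal cell in the standard formulation; weight names and sizes here are toy assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
d = 3                                         # hidden/input size (kept equal for brevity)
W_z = rng.standard_normal((d, 2 * d)) * 0.1   # update gate weights
W_r = rng.standard_normal((d, 2 * d)) * 0.1   # reset gate weights
W_h = rng.standard_normal((d, 2 * d)) * 0.1   # candidate state weights

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gru_step(h, x):
    hx = np.concatenate([h, x])
    z = sigmoid(W_z @ hx)                     # update gate: how much to overwrite
    r = sigmoid(W_r @ hx)                     # reset gate: how much past enters candidate
    h_tilde = np.tanh(W_h @ np.concatenate([r * h, x]))
    return (1.0 - z) * h + z * h_tilde        # interpolate old state and candidate

h = np.zeros(d)
x = rng.standard_normal(d)
h = gru_step(h, x)
```

The gating is what lets gradients flow over long distances, which matters for machine translation where distant source words influence each target word.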
Lecture 18: Tackling the Limits of Deep Learning for NLP
Lecture 18 looks at tackling the limits of deep learning for NLP followed by a few presentations.