AI News, nicholaslocascio/deep-regex
- On Sunday, June 3, 2018
Our neural model translates natural language queries into regular expressions which embody their meaning.
We also present a methodology for collecting a large corpus of (regular expression, natural language) pairs using Mechanical Turk and grammar generation.
Install dependencies with pip install -r requirements.txt. Datasets are provided in three folders within /datasets/: KB13, NL-RX-Synth, and NL-RX-Turk.
Each dataset is a parallel corpus, so each folder is split into two files: src.txt and targ.txt.
- On Friday, January 18, 2019
Anna Rohrbach: Grounding and Generation of Natural Language Descriptions for Images and Videos
Abstract: In recent years many challenging problems have ...
Lecture 3 | GloVe: Global Vectors for Word Representation
Lecture 3 introduces the GloVe model for training word vectors. Then it extends our discussion of word vectors (interchangeably called word embeddings) by ...
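For reference, the objective the lecture builds up to: GloVe fits word vectors so that dot products approximate log co-occurrence counts, minimizing a weighted least-squares loss J = sum_ij f(X_ij) (w_i . w~_j + b_i + b~_j - log X_ij)^2. A sketch of that loss (array names and toy counts are illustrative, not from the lecture materials):

```python
import numpy as np

def glove_loss(W, W_tilde, b, b_tilde, X, x_max=100, alpha=0.75):
    """Weighted least-squares GloVe objective over nonzero co-occurrences.

    W, W_tilde: (vocab, dim) word and context vectors
    b, b_tilde: (vocab,) word and context biases
    X: (vocab, vocab) co-occurrence count matrix
    """
    loss = 0.0
    for i, j in zip(*np.nonzero(X)):
        # f() caps the influence of very frequent pairs
        f = min(1.0, (X[i, j] / x_max) ** alpha)
        diff = W[i] @ W_tilde[j] + b[i] + b_tilde[j] - np.log(X[i, j])
        loss += f * diff ** 2
    return loss
```

Only nonzero counts contribute, which is what makes the objective tractable on large corpora.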
A Joint Speaker-Listener-Reinforcer Model for Referring Expressions | Spotlight 2-2C
Licheng Yu, Hao Tan, Mohit Bansal, Tamara L. Berg. Referring expressions are natural language constructions used to identify particular objects within a scene.
Lecture 16: Dynamic Neural Networks for Question Answering
Lecture 16 addresses the question "Can all NLP tasks be seen as question answering problems?". Key phrases: Coreference Resolution, Dynamic Memory ...
The Zipf Mystery
The of and to. A in is I. That it, for you, was with on. As have ... but be they.
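The opening line above is itself a ranked list of the most common English words, which is Zipf's law in action: the r-th most frequent word occurs with frequency roughly proportional to 1/r. A quick way to see the rank-frequency curve on any text (the corpus path is a stand-in):

```python
from collections import Counter

def rank_frequencies(text):
    """Return word counts sorted from most to least common."""
    counts = Counter(text.lower().split())
    return [freq for _, freq in counts.most_common()]

# Under Zipf's law, freq(rank r) ~ C / r, so r * freq stays roughly constant:
# freqs = rank_frequencies(open("corpus.txt").read())  # hypothetical corpus
# zipf_products = [r * f for r, f in enumerate(freqs, start=1)]
```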
Open Source TensorFlow Models (Google I/O '17)
Come to this talk for a tour of the latest open source TensorFlow models for Image Classification, Natural Language Processing, and Computer Generated ...
Lecture 8: Recurrent Neural Networks and Language Models
Lecture 8 covers traditional language models, RNNs, and RNN language models. Also reviewed are important training problems and tricks, RNNs for other ...
NW-NLP 2018: Adverbial Clausal Modifiers in the LinGO Grammar Matrix
The fifth Pacific Northwest Regional Natural Language Processing Workshop will be held on Friday, April 27, 2018, in Redmond, WA. We accepted abstracts ...
On Characterizing the Capacity of Neural Networks using Algebraic Topology
The learnability of different neural architectures can be characterized directly by computable measures of data complexity. In this talk, we reframe the problem of ...
Lecture 13 | Generative Models
In Lecture 13 we move beyond supervised learning, and discuss generative modeling as a form of unsupervised learning. We cover the autoregressive ...