A model library for exploring state-of-the-art deep learning topologies and techniques for optimizing Natural Language Processing neural networks
A collection of datasets that pair questions with SQL queries.
A Japanese tokenizer based on recurrent neural networks
Simple Solution for Multi-Criteria Chinese Word Segmentation
A frame-semantic parsing system based on a softmax-margin SegRNN.
Source code for an ACL 2017 paper on Chinese word segmentation
BiLSTM-CRF for sequence labeling in DyNet (see the BiLSTM sketch after this list)
Source code for an ACL 2016 paper on Chinese word segmentation
Code for the paper "End-to-End Reinforcement Learning for Automatic Taxonomy Induction", ACL 2018
An implementation of the Transformer (Attention Is All You Need) in DyNet
Deep Recurrent Generative Decoder for Abstractive Text Summarization in DyNet
Transition-based joint syntactic dependency parser and semantic role labeler using a stack LSTM RNN architecture.
Dataset and model for disentangling chat on IRC
Code for the paper "Extreme Adaptation for Personalized Neural Machine Translation"
Source code for the paper "Morphological Inflection Generation with Hard Monotonic Attention"
An attentional NMT model in DyNet
See http://github.com/onurgu/joint-ner-and-md-tagger. This repository is a Bi-LSTM-based sequence tagger in both TensorFlow and DyNet that can combine several sources of information about each word unit, such as word embeddings, character-based embeddings, and morphological tags from an FST, to obtain the representation for that specific wor…
DyNet implementation of stack LSTM experiments by Grefenstette et al.
Selective Encoding for Abstractive Sentence Summarization in DyNet
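Several of the repositories above (the BiLSTM-CRF tagger and the joint NER/MD tagger, for example) share the same backbone: a bidirectional LSTM encoder over word embeddings with a per-token scoring layer. The following is a minimal, hypothetical sketch of that pattern using DyNet 2.x's Python API; the vocabulary sizes, dimensions, and data are toy placeholders, and the CRF layer some of the listed projects add on top is omitted.

```python
import random

import dynet as dy

# Hypothetical toy dimensions; a real tagger derives vocab/tag sizes from data.
VOCAB_SIZE, TAG_COUNT = 1000, 10
EMB_DIM, HIDDEN_DIM = 64, 32

pc = dy.ParameterCollection()
embeds = pc.add_lookup_parameters((VOCAB_SIZE, EMB_DIM))
fwd = dy.LSTMBuilder(1, EMB_DIM, HIDDEN_DIM, pc)  # left-to-right LSTM
bwd = dy.LSTMBuilder(1, EMB_DIM, HIDDEN_DIM, pc)  # right-to-left LSTM
W = pc.add_parameters((TAG_COUNT, 2 * HIDDEN_DIM))
b = pc.add_parameters(TAG_COUNT)
trainer = dy.SimpleSGDTrainer(pc)

def sentence_loss(word_ids, tag_ids):
    """Sum of per-token negative log-likelihoods for one sentence."""
    dy.renew_cg()  # DyNet builds a fresh computation graph per example
    xs = [embeds[w] for w in word_ids]
    f_states = fwd.initial_state().transduce(xs)
    b_states = reversed(bwd.initial_state().transduce(list(reversed(xs))))
    losses = []
    for f, bk, t in zip(f_states, b_states, tag_ids):
        scores = W * dy.concatenate([f, bk]) + b  # per-token tag scores
        losses.append(dy.pickneglogsoftmax(scores, t))
    return dy.esum(losses)

# One training step on a randomly generated toy sentence:
words = [random.randrange(VOCAB_SIZE) for _ in range(5)]
tags = [random.randrange(TAG_COUNT) for _ in range(5)]
loss = sentence_loss(words, tags)
print("loss:", loss.value())
loss.backward()
trainer.update()
```

Because DyNet rebuilds the computation graph for every example (`dy.renew_cg`), variable-length sentences need no padding or bucketing, which is one reason the dynamic-graph toolkit appears so often in the sequence-labeling and parsing projects listed here.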