
RNN-LM

Awesome tutorials I used:

http://www.wildml.com/2015/09/recurrent-neural-networks-tutorial-part-2-implementing-a-language-model-rnn-with-python-numpy-and-theano/
https://github.com/gwtaylor/theano-rnn

For a seminar on "Selected Topics in Human Language Technology and Pattern Recognition" I built a bigram LM and an RNN-LM, trained them on my seminar paper, and let them generate text.
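The bigram model is just word-pair counting plus sampling. Here is a minimal sketch of that idea (hypothetical code, not this repository's implementation; the SENTENCE_START / SENTENCE_END markers match the samples below):

```python
import random
from collections import defaultdict, Counter

def train_bigram(sentences):
    """Count bigram frequencies; sentences are token lists already
    wrapped in SENTENCE_START ... SENTENCE_END."""
    counts = defaultdict(Counter)
    for sent in sentences:
        for prev, word in zip(sent, sent[1:]):
            counts[prev][word] += 1
    return counts

def sample_sentence(counts, max_len=100):
    """Generate a sentence by repeatedly drawing the next word
    proportionally to its bigram count."""
    sent = ["SENTENCE_START"]
    while sent[-1] != "SENTENCE_END" and len(sent) < max_len:
        words, freqs = zip(*counts[sent[-1]].items())
        sent.append(random.choices(words, weights=freqs)[0])
    return " ".join(sent)
```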

Here are some results:

Sentences generated by the bigram LM:

SENTENCE_START [ 1 , 1994references 23 ) then is the wsj - kn5 + 1 n is visualized in figure 7 ] , so in one is , enabling more complex pattern than the nn-lm exploits the less data [ 11 ] where s ) since the quadratic cost ( t − 1 represent each weight matrices , nor to feed forward neural network [ 0 and extensions of the architectural ability , named a bigram example p m 2 propagate error : r 7→ [ 15 ] to labeled training neural networks are the translation developed out when learning algorithm , it refers to 20 ] SENTENCE_END SENTENCE_START computational linguistics , 11th annual conference on neural network the amount of machine translation for example per word feature vectors together at every vector s roossin SENTENCE_END SENTENCE_START 5 : 1 − u ji ) ( asru ) = p z ( iamgoinghouse ) = n ) w 2 ) p ( t x 1 if it calculates the hidden layer x i |c i = u ( high weight matrices SENTENCE_END SENTENCE_START the output , the application of the distance , 2010 SENTENCE_END SENTENCE_START ( t ) ( x ( 23 ( t ) depending on supervised learning and extensions SENTENCE_END SENTENCE_START there has an analog value and the sentence the decision tree lm , to zero SENTENCE_END SENTENCE_START the dependencies with the error back in a lower dimensional machine translation [ 12 ] SENTENCE_END SENTENCE_START but because it is complicating the hidden layer SENTENCE_END SENTENCE_START the input vector to recognize similarities and jean-luc gauvain , t − x i ( t ) otherwise , that the n-gram model SENTENCE_END SENTENCE_START another example [ 15 ] SENTENCE_END

Sentences generated by the RNN-LM:

SENTENCE_START first analog 19 ptb detail under SENTENCE_END SENTENCE_START combining were con- annual r ”encrypt” SENTENCE_END SENTENCE_START labeled kind using icassp the not SENTENCE_END SENTENCE_START 4 be is are [ 4 SENTENCE_END SENTENCE_START can an machine network a methods SENTENCE_END SENTENCE_START scientific distributed 36 called schmidhuber annual probabilities is america \ roughly gate SENTENCE_END SENTENCE_START gates impact the for real SENTENCE_END SENTENCE_START art data or weights paragraphs16 mercer effort a SENTENCE_END SENTENCE_START putting remembered however all done recurent unit nlp often , SENTENCE_END SENTENCE_START complete ralf rnn-lms talk linguistics [ network23 ] SENTENCE_END
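The RNN-LM follows the simple recurrent architecture from the WildML tutorial linked above: s_t = tanh(U x_t + W s_{t-1}), o_t = softmax(V s_t). Below is a minimal numpy sketch of the forward pass (the parameter names U, V, W come from the tutorial, not necessarily from this repository's code):

```python
import numpy as np

def forward(U, V, W, x):
    """Run the RNN over a list of word indices x and return, for each
    time step, a probability distribution over the vocabulary."""
    s = np.zeros(U.shape[0])                  # hidden state, starts at zero
    outputs = []
    for t in x:
        s = np.tanh(U[:, t] + W.dot(s))       # s_t = tanh(U x_t + W s_{t-1})
        logits = V.dot(s)
        o = np.exp(logits - logits.max())     # numerically stable softmax
        outputs.append(o / o.sum())
    return outputs
```

Sampling then works as in the bigram case: start from SENTENCE_START, draw the next word from the output distribution, feed it back in, and stop at SENTENCE_END.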
