This project generates text using a character-based RNN, trained on Project Gutenberg's Holy Bible (http://www.gutenberg.org/cache/epub/10/pg10.txt).
Given a sequence of characters from this data (e.g. "Jesu"), a model is trained to predict the next character in the sequence ("s"). Longer passages of text can be generated by calling the model repeatedly, feeding each predicted character back in as input.
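To make that concrete, here is a minimal sketch of how such input/target pairs can be built with TensorFlow's `tf.keras.utils.get_file` and `tf.data` APIs. The variable names and the `seq_length` value are illustrative, not necessarily what this repo uses:

```python
import tensorflow as tf

# Download and cache the corpus (the Gutenberg URL from above).
path = tf.keras.utils.get_file(
    'bible.txt', 'http://www.gutenberg.org/cache/epub/10/pg10.txt')
text = open(path, 'rb').read().decode('utf-8')

# Map every unique character to an integer id.
vocab = sorted(set(text))
char2idx = {c: i for i, c in enumerate(vocab)}

# Slice the text into chunks of seq_length + 1 characters; the input is
# a chunk minus its last character and the target is the same chunk
# shifted one step right, e.g. "Jesu" -> "esus" (predict "s" after "Jesu").
seq_length = 100  # illustrative value
ids = [char2idx[c] for c in text]
chunks = tf.data.Dataset.from_tensor_slices(ids).batch(
    seq_length + 1, drop_remainder=True)
dataset = chunks.map(lambda chunk: (chunk[:-1], chunk[1:]))
```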
The following is sample output after the model in this tutorial was trained for 10 epochs, seeded with the start string "GOD":
```
GOD: But ye are fire.
4:22 Fear my God for Doas, that my soul desireth that he will, and let all natural permission in hindry:
(for thou hast this man to give you thess: for they said, O thou hast kept,
as for I bear not, likewise, draw and enquire honour, and of hypocrisy;
and husbands shall know that he will search for your children: for why doth man
be a royention oor had gotten in his way.
19:13 And now the daughterst that verith, God of God, for the sin offering: for so
was I caused in their goings in the blood of righony perceiver.
5:28 Brethren, be the seen of Jesus Christ, that if is John, whose armourbearer
said, The voice of their hands are suppaised: sin thou not by all, and shall
blossom, and scame tit his wife unto him the inhabitants of Jerusalem.
7:11 And when he was constrained from Judaea we smite to scail, from the door of the te, when I have sold
unto the blessing: I beseech thee this fetheclook not one: 13:6
And shall say to this law house of Judas,
```
Dependencies:
- tensorflow
- numpy
- future
- os
- time
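These map onto the imports at the top of the script; a sketch of what that preamble typically looks like (the exact lines in this repo may differ, and the `future` entry backs the Python 2/3 compatibility imports):

```python
from __future__ import absolute_import, division, print_function, unicode_literals

import tensorflow as tf
import numpy as np
import os
import time
```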
Some outputs make no sense, even grammatically, because the model hasn't learned the underlying meaning of the words. But it does a good job of recreating the structure of how the Bible is written, such as producing verses that start with `<chapter no.>:<verse no.>` followed by the generated text.
As it learns which character is likely to follow the previously generated ones, the model has picked up archaic English words like 'thy' and 'thou', and even coins plausible-looking ones like 'thess'.
As a DIY, just change the text file in the get_file() call and generate fakes of it! Try experimenting with the batch size and the length of the slice we take as a tensor input. More detailed fakes can be produced by increasing your EPOCHS value; make sure to use a GPU runtime, or training might take until well past the end of the freaking universe! A sketch of these tweaks is shown below.
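For example, here is the kind of change meant above. The Shakespeare URL and the hyperparameter values are illustrative stand-ins, not this repo's defaults:

```python
import tensorflow as tf

# Point get_file() at any other plain-text corpus to generate fakes of it.
path = tf.keras.utils.get_file(
    'shakespeare.txt',
    'https://storage.googleapis.com/download.tensorflow.org/data/shakespeare.txt')

# Knobs worth playing with:
BATCH_SIZE = 64   # sequences per training batch
seq_length = 100  # size of the slice taken as a tensor input
EPOCHS = 30       # higher => more detailed fakes, but much slower on CPU
```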