Stephen King Text Generation with Artificial Intelligence (RNN), Using Python



Here’s how I trained a Deep Learning architecture to write a Stephen King-like text.

I love reading, and I’m a big fan of Stephen King. Also, I’m a physicist and a data scientist, and I’m currently taking a break after I got my Master’s Degree. During this pause, I’ve decided to read another Stephen King book, and I really loved it.

I would never, ever, EVER, let a computer write a Stephen King book for one simple reason.

We are not there with the technology yet.

If you look at some NLP experiments (e.g. AI writing jokes here), you can safely say that AI is not yet as good as humans at writing.

So you may ask: “Why are you telling me that we are not able to let AI generate text in a blog post that explains how to generate text with AI?”. Well, the answer is this:

While it is true that we can’t let a computer write an entire book (for now), AI is still able to suggest plots, provide insights, and help humans write.

Actually, it does these things extremely well!

Thus, I’ve decided to take a bunch of Stephen King texts and use an AI (Deep Learning) algorithm to write a small new one. This is the game plan:

  1. The Theory
  2. The Code
  3. The Results

I understand that some of you might be interested in the results only, or in the theoretical part only, or in the code only. Each part is self-contained, so feel free to skip to your favourite part.

Let’s dive in!

1. The Theory

Note: This is a brief overview of the required theory that is behind the AI structure. You should refer to different sources or books if you feel you need a more solid theoretical background.

When we say AI, we mean a lot of things. When talking about “text generation”, however, we usually refer to Deep Learning. Deep Learning is a special kind of AI that completes its task by performing layered learning. Let’s make it easy: if you want to recognize whether the animal in front of you is a cat, you would probably start by looking at its colours, then its shape, then its eyes, and so on and so forth. You basically start from simple concepts and build more complex concepts out of the simpler ones. For this reason, we say that the learning is layered (i.e. the input is processed by various layers).

Let’s say that the input is an image made of n = N x N pixels.
This input is processed by various layers, and then the result is 1 if the animal is a cat and 0 if it is not. Pretty simple, right?

An important thing to know is that this structure needs to be trained. That means that you have to use tons of labelled images of cats and non-cats (idk, dogs?) and build a learning process to make the machine learn how to perform its task.
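To make the cat example concrete, here is a minimal sketch of such a layered classifier in Keras. All the sizes (a 32 × 32 toy image, the layer widths) are my own illustrative choices, not something from the article:

```python
import numpy as np
import tensorflow as tf

N = 32  # toy image size, so n = N x N pixels

# A minimal layered classifier: pixels in, cat-probability out
model = tf.keras.Sequential([
    tf.keras.Input(shape=(N, N)),
    tf.keras.layers.Flatten(),                       # the n input pixels
    tf.keras.layers.Dense(64, activation="relu"),    # first layer: simple features
    tf.keras.layers.Dense(32, activation="relu"),    # next layer: more complex ones
    tf.keras.layers.Dense(1, activation="sigmoid"),  # output: 1 = cat, 0 = not a cat
])
model.compile(optimizer="adam", loss="binary_crossentropy")

# Training would use the tons of labelled images mentioned above:
# model.fit(images, labels, epochs=...)
fake_images = np.random.rand(8, N, N)
print(model.predict(fake_images, verbose=0).shape)  # (8, 1): one probability per image
```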

Now, let’s talk about business. When we talk about text generation we have to keep in mind two important concepts:

A) The Word Embedding

B) The Recurrent Neural Networks

Word Embedding is the part where the machine converts a text (a string) into a vector (a sequence of numbers).

Recurrent Neural Networks are special Deep Learning structures. They are able to use all the previous inputs of a sequence (like a sentence) to predict the probability of a certain output. In particular, there is a vocabulary that contains all the words of the texts (let’s say there are V of them) and a logit output. The latter is a sequence of V numbers in the [0,1] range whose sum is 1: the j-th number of the sequence tells you the probability of obtaining the j-th word after the input sequence.
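The step from the network’s raw scores (logits) to those V probabilities is the softmax function. A tiny NumPy example with a made-up five-word vocabulary:

```python
import numpy as np

# Toy vocabulary with V = 5 words (my own example)
vocab = ["the", "dark", "tower", "stood", "alone"]

# Raw scores (logits) the network might assign to the next word
logits = np.array([2.0, 1.0, 0.5, 0.2, -1.0])

# Softmax turns them into V numbers in [0, 1] that sum to 1
probs = np.exp(logits) / np.exp(logits).sum()

for word, p in zip(vocab, probs):
    print(f"{word}: {p:.3f}")
print("sum:", round(probs.sum(), 6))  # 1.0
```

The highest logit (“the”) gets the highest probability, but every word keeps a non-zero chance of being sampled.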

Everything that has been explained so far is summarized in the following scheme:


2. The Code

Note: This part is inspired and adapted from this source

This may surprise you, but the code part is extremely simple.

These are the libraries you will need:
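The original code listing is not shown here. Assuming the TensorFlow tutorial this section adapts, a plausible set is just the standard stack (a sketch, not necessarily the author’s exact imports):

```python
import os

import numpy as np
import tensorflow as tf
```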

All the Stephen King texts I used can be found here.
After you get them, it is easy to load them with some pretty simple, ordinary lines of Python:
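Something along these lines, assuming the downloaded texts are plain .txt files in a local folder (the folder name is mine, for illustration):

```python
import os

def load_corpus(folder):
    """Concatenate every .txt file in `folder` into one big training string."""
    parts = []
    for filename in sorted(os.listdir(folder)):
        if filename.endswith(".txt"):
            with open(os.path.join(folder, filename), encoding="utf-8") as f:
                parts.append(f.read())
    return "\n".join(parts)

# text = load_corpus("king_texts")  # wherever you saved the downloaded files
```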

These are the lines of code to run in order to adapt the text to TensorFlow and create the dictionary:
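The generated samples later in the post break off mid-word (“takin”), which suggests a character-level model like the one in the tutorial this section adapts. Under that assumption, the dictionary step looks like this (using a short stand-in string in place of the full corpus):

```python
import numpy as np

text = "We are not there with the technology yet."  # stand-in for the full corpus

# The dictionary: every distinct character gets an integer id
vocab = sorted(set(text))
char2idx = {ch: i for i, ch in enumerate(vocab)}
idx2char = np.array(vocab)

# The corpus as a sequence of integer ids, the form TensorFlow works with
text_as_int = np.array([char2idx[ch] for ch in text])

print(len(vocab), "unique characters")
```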

Building the model:
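A minimal sketch of such a model, assuming the Embedding → GRU → Dense stack from the TensorFlow tutorial (the layer sizes are that tutorial’s defaults, not necessarily the ones used here):

```python
import tensorflow as tf

def build_model(vocab_size, embedding_dim=256, rnn_units=1024):
    """Embedding -> GRU -> Dense(vocab_size): one logit per vocabulary entry."""
    return tf.keras.Sequential([
        tf.keras.Input(shape=(None,), dtype="int32"),        # a batch of id sequences
        tf.keras.layers.Embedding(vocab_size, embedding_dim),  # the word-embedding step
        tf.keras.layers.GRU(rnn_units, return_sequences=True), # the recurrent layer
        tf.keras.layers.Dense(vocab_size),                     # logits over the vocabulary
    ])
```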

Training the model:
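A self-contained training sketch: the id stream is cut into (input, target) pairs shifted by one character, and the model minimizes the cross-entropy between its logits and the true next character. The tiny corpus, small layer sizes, and single epoch are mine so the sketch runs quickly; the real run feeds the whole dataset for many epochs:

```python
import numpy as np
import tensorflow as tf

# Tiny stand-in corpus; the real run uses the full Stephen King text
text = "we are not there with the technology yet. " * 20
vocab = sorted(set(text))
char2idx = {ch: i for i, ch in enumerate(vocab)}
ids = np.array([char2idx[ch] for ch in text], dtype=np.int32)

# Cut the id stream into (input, target) pairs shifted by one character
seq_length = 20
chunks = tf.data.Dataset.from_tensor_slices(ids).batch(seq_length + 1,
                                                       drop_remainder=True)
dataset = chunks.map(lambda s: (s[:-1], s[1:])).batch(8, drop_remainder=True)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(None,), dtype="int32"),
    tf.keras.layers.Embedding(len(vocab), 32),
    tf.keras.layers.GRU(64, return_sequences=True),
    tf.keras.layers.Dense(len(vocab)),
])
model.compile(optimizer="adam",
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))
history = model.fit(dataset, epochs=1, verbose=0)
```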

Getting the trained model:
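The usual pattern here, assuming the tutorial’s approach: keep only the trained weights on disk, then rebuild the same architecture and reload them at generation time. The file name is my own:

```python
import numpy as np
import tensorflow as tf

def build_model(vocab_size):
    return tf.keras.Sequential([
        tf.keras.Input(shape=(None,), dtype="int32"),
        tf.keras.layers.Embedding(vocab_size, 8),
        tf.keras.layers.GRU(16, return_sequences=True),
        tf.keras.layers.Dense(vocab_size),
    ])

# After training, keep only the weights...
trained = build_model(vocab_size=10)
trained.save_weights("king_rnn.weights.h5")

# ...and at generation time rebuild the architecture and reload them
gen_model = build_model(vocab_size=10)
gen_model.load_weights("king_rnn.weights.h5")
```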

And here is the generate text function:

That can be used to actually generate text with these simple lines (provided that an input string is given):

E.g. “After the” has been given as an input sequence.
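A sketch of what that generate-text function likely looks like, assuming a character-level model: the seed string is converted to ids, and the next character is repeatedly sampled from the temperature-scaled logits. (Re-feeding the whole growing sequence each step keeps the sketch simple; a stateful model, as in the tutorial, would be faster.)

```python
import tensorflow as tf

def generate_text(model, start_string, char2idx, idx2char,
                  num_generate=100, temperature=0.5):
    """Repeatedly sample the next character from temperature-scaled logits."""
    ids = [char2idx[ch] for ch in start_string]
    generated = []
    for _ in range(num_generate):
        logits = model(tf.constant([ids]))       # shape (1, len(ids), V)
        logits = logits[:, -1, :] / temperature  # keep only the last time step
        next_id = int(tf.random.categorical(logits, num_samples=1)[0, 0])
        ids.append(next_id)
        generated.append(idx2char[next_id])
    return start_string + "".join(generated)

# With a trained `model` and the dictionaries built earlier:
# print(generate_text(model, "After the", char2idx, idx2char))
```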

3. The Results

The AI structure described so far can, in principle, generate an infinite text from a single input. Of course, this would perform badly, as it would just end up repeating the same words again and again. For this reason, it is safer to generate a single sentence out of each input. It is important to notice that there is a parameter, called t (temperature), that lets you set the creativity of the algorithm: t close to 0 means repeating the same words as the dataset, while t close to 1 means inventing non-English sentences or words. A good balance between these two is the key.
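Mechanically, the temperature divides the logits before the softmax, which is why small t repeats the dataset and large t gets inventive. A NumPy sketch with made-up logits:

```python
import numpy as np

def softmax_with_temperature(logits, t):
    """Divide the logits by t before softmax: small t sharpens the
    distribution (always pick the likeliest word), large t flattens it
    (more 'creative', riskier choices)."""
    scaled = np.asarray(logits) / t
    exp = np.exp(scaled - scaled.max())  # subtract the max for numerical stability
    return exp / exp.sum()

logits = [2.0, 1.0, 0.2, -1.0]
print(softmax_with_temperature(logits, 0.1).round(3))  # nearly one-hot
print(softmax_with_temperature(logits, 1.0).round(3))  # much flatter
```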

Let’s show you some results!

First Example (t = 0.1, Medium Input) :

Input: "After the"
Output: "After the sun was warm on our faces we were both wearing sweater (.) You know how confused dreams are logic like"

Second Example (t=0.1, Long Input):

Input: "So I decided"
Output: "So I decided the ordinary was going on. I think this was because in my heart I believed that such conditions only"

Third Example (t=0.7, Short Input):

Input: "I"
Output: "I fell in love to the sound of breaking glass edged it liked jagged lace. The two ved her be a fluke I r"

Fourth Example (t=0.2, Short Input):

Input: "He"
Output: "He who spotted her. He recognized her by her red handbag spread out in front of me and it was like takin"

Some Fun (wordcloud):

The bigger the word, the more frequent it is.
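A word cloud like the one pictured (likely drawn with the `wordcloud` package, though the article doesn’t say) is just word frequencies rendered with size proportional to count. The counting underneath is a few lines, sketched here with a stand-in sentence instead of the corpus:

```python
import re
from collections import Counter

text = "It was the dark, and the dark was all there was."  # stand-in corpus

# A word cloud is word frequencies drawn with size ~ count
words = re.findall(r"[a-z']+", text.lower())
counts = Counter(words)
print(counts.most_common(3))  # [('was', 3), ('the', 2), ('dark', 2)]
```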

Image made by me with Python


As you can see, this technology does not let us build a fully consistent Stephen King text generator. Nonetheless, Stephen King fans might recognize something that reminds them of him in the generated text. That comes as no surprise, as the AI algorithm has “seen” a dataset made of Stephen King texts. On the other hand, it is surprising as hell that the algorithm has invented a new, Stephen King-like (really short) text!

I really hope you enjoyed the article. Comments, corrections and thoughts are well appreciated here:

Happy Stephen King to anyone!


