Recurrent neural networks (RNNs) are a class of machine learning models, loosely inspired by the brain, that work well with sequential data such as text. Fans of popular series, like those enthralled by Game of Thrones, are fed up with waiting for installments that may not appear for years. Now full-stack software engineer Zack Thoutt is training an RNN to predict the events of the unfinished sixth novel.

Training such a model requires labels, or target variables, that capture what the model should ideally output. The network compares the data it outputs with the targets and updates its weights to better mimic them. For example, a perfect model would take everything that has happened in the prior books into account, including which characters have already died. The architecture that makes this kind of long-range recall possible is called "long short-term memory" (LSTM). Unfortunately for the impatient reader, the technology to train a perfect text generator that can remember complex plots over millions of words doesn't exist yet, but the process underway is fun and fascinating.
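As a rough illustration (not Thoutt's actual code), the compare-output-to-target training idea can be sketched with a tiny character-level vanilla RNN in NumPy. An LSTM adds gating so the network can remember much longer contexts, but the core loop is the same: predict the next character, measure how far the prediction is from the target, and nudge the weights. All names and numbers below are illustrative.

```python
import numpy as np

# Toy character-level RNN: the target for each character is simply
# the character that follows it in the text.
np.random.seed(0)
text = "winter is coming. winter is coming. "
chars = sorted(set(text))
ix = {c: i for i, c in enumerate(chars)}
V, H = len(chars), 16  # vocabulary size, hidden-state size

Wxh = np.random.randn(H, V) * 0.01  # input -> hidden
Whh = np.random.randn(H, H) * 0.01  # hidden -> hidden (the "recurrence")
Why = np.random.randn(V, H) * 0.01  # hidden -> output
bh, by = np.zeros(H), np.zeros(V)

def loss_and_grads(inputs, targets):
    """Forward pass + backpropagation through time for one sequence."""
    hs = {-1: np.zeros(H)}
    xs, ps, loss = {}, {}, 0.0
    for t, (ci, ti) in enumerate(zip(inputs, targets)):
        x = np.zeros(V); x[ci] = 1.0           # one-hot input character
        xs[t] = x
        hs[t] = np.tanh(Wxh @ x + Whh @ hs[t - 1] + bh)
        y = Why @ hs[t] + by
        p = np.exp(y - y.max()); p /= p.sum()  # softmax over next char
        ps[t] = p
        loss -= np.log(p[ti])                  # cross-entropy vs target
    dWxh, dWhh, dWhy = np.zeros_like(Wxh), np.zeros_like(Whh), np.zeros_like(Why)
    dbh, dby, dhnext = np.zeros_like(bh), np.zeros_like(by), np.zeros(H)
    for t in reversed(range(len(inputs))):
        dy = ps[t].copy(); dy[targets[t]] -= 1  # output minus target
        dWhy += np.outer(dy, hs[t]); dby += dy
        dh = Why.T @ dy + dhnext
        draw = (1 - hs[t] ** 2) * dh            # backprop through tanh
        dWxh += np.outer(draw, xs[t]); dWhh += np.outer(draw, hs[t - 1])
        dbh += draw
        dhnext = Whh.T @ draw
    return loss, (dWxh, dWhh, dWhy, dbh, dby)

inputs = [ix[c] for c in text[:-1]]
targets = [ix[c] for c in text[1:]]  # targets are the inputs shifted by one

first_loss, _ = loss_and_grads(inputs, targets)
for step in range(300):
    loss, grads = loss_and_grads(inputs, targets)
    for param, grad in zip((Wxh, Whh, Why, bh, by), grads):
        param -= 0.05 * np.clip(grad, -5, 5)  # clipped SGD update
# After training, the loss is lower: the outputs better mimic the targets.
```

The plain RNN above forgets quickly because each hidden state is overwritten at every step; an LSTM's gates let it carry information (a character's death, say) across much longer stretches of text, which is why Thoutt's project uses one.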