Character-level recurrent neural networks have become very popular in the deep learning community in recent years for learning probability distributions over sequences of characters — for example, the probability of an "o" given the preceding letters "hell". There are many ways to model sequences of characters, however, and the task does not require a recurrent neural network, nor a character-level representation. Simple Markov models, for instance, can perform reasonably well in this context (where "reasonably" is left loosely defined).
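To make the "hell" → "o" example concrete, here is a minimal sketch of such a character-level Markov model: it counts which character follows each fixed-length context in a training string and normalizes the counts into a probability distribution. The function names and the toy training text are illustrative, not from any particular library.

```python
from collections import Counter, defaultdict

def train_char_markov(text, order=4):
    """Count next-character frequencies for each `order`-length context."""
    counts = defaultdict(Counter)
    for i in range(len(text) - order):
        context = text[i:i + order]
        counts[context][text[i + order]] += 1
    return counts

def next_char_probs(counts, context):
    """Normalize the raw counts into a probability distribution."""
    c = counts[context]
    total = sum(c.values())
    return {ch: n / total for ch, n in c.items()}

# Toy corpus: after the context "hell", only "o" ever occurs.
counts = train_char_markov("hello hello hello", order=4)
print(next_char_probs(counts, "hell"))  # {'o': 1.0}
```

Unlike an RNN, this model can only condition on the last `order` characters, which is exactly the fixed-window limitation that recurrent architectures are meant to overcome.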