Hello everyone,
I went to the International Roguelike Developer Conference (IRDC 2016). It was fun; you can watch the talks on Twitch, and they're going to stream more tomorrow (link).
Here is a recap of the talks:
Markov Text Generation:
Caves of Qud's generated text (books, tomes, realms, lore) is produced with Markov chain models. They use the models to generate paragraphs (3 to 6 sentences) and books (4 to 8 paragraphs). Some other developers use bidirectional Markov chains instead of the usual single-direction ones. For book titles they originally used template filling (in the style of Tracery), but that technique felt limited, so they replaced it by generating a length-limited sentence from the Markov chain and then shaving the unwanted words off its beginning and end. For hidden secrets in the books, the speaker generates all the secrets first and then feeds them into the Markov chain model so they get woven into the generated text. He also recommended checking out The Annals of the Parrigues by Emily Short.
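To make the idea concrete, here is a minimal sketch of word-level Markov chain generation plus the title-trimming trick described above. This is not Caves of Qud's actual code; the corpus, chain order, word caps, and stopword list are illustrative assumptions of mine.

```python
import random
from collections import defaultdict

def build_chain(text, order=1):
    """Map each `order`-word prefix to the list of words that can follow it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        prefix = tuple(words[i:i + order])
        chain[prefix].append(words[i + order])
    return chain

def generate_sentence(chain, max_words=20):
    """Walk the chain from a random prefix until a period or the word cap."""
    prefix = random.choice(list(chain.keys()))
    out = list(prefix)
    while len(out) < max_words:
        followers = chain.get(tuple(out[-len(prefix):]))
        if not followers:
            break
        word = random.choice(followers)
        out.append(word)
        if word.endswith('.'):
            break
    return ' '.join(out)

def generate_title(chain, max_words=6):
    """Generate a short sentence, then shave stray filler words off both ends,
    a rough stand-in for the trimming trick mentioned in the talk."""
    words = generate_sentence(chain, max_words + 4).rstrip('.').split()
    filler = {'and', 'or', 'but', 'of', 'the', 'a', 'an', 'to', 'in'}
    while words and words[0].lower() in filler:
        words.pop(0)
    while words and words[-1].lower() in filler:
        words.pop()
    return ' '.join(words[:max_words]).title()

# Tiny made-up corpus; a real generator would train on much more text.
corpus = ("The salt dunes remember every traveler. Every traveler forgets the salt. "
          "The chrome pilgrims walk beneath the dunes and dream of rust.")
chain = build_chain(corpus, order=1)
print(generate_sentence(chain))
print(generate_title(chain))
```

Injecting secrets would just mean appending the pre-generated secret sentences to the corpus before calling build_chain, so the chain can surface them mid-paragraph.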