Inputs

Search for .epub files on Library Genesis

URL: Add a Project Gutenberg or Archive.org URL.
A random Project Gutenberg document will be selected.

Sample: random paragraphs or random sentences

Block Size Between: ___ and ___ words

Outputs

About Markov Chains


A Markov chain is a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event. In the case of text generation, these probabilities are determined by the word order of an input text, which can be broken down into n-grams (units) of 1, 2, or 3 words.

For the order-1 n-gram model, Markov text generation begins by selecting a random word which begins a sentence, then selects a random word which followed that word in the input, weighted by frequency of appearance: for instance, if in the input "The" is followed by "rhombus" thrice and "skeleton" once, there is a 75% chance the algorithm will select "rhombus" next and a 25% chance it will select "skeleton". If "skeleton" is selected, the word which follows "skeleton" in the original text will appear next, since "skeleton" appeared only once; but if "rhombus" is selected, then a random word which follows "rhombus" in the original passage will appear. The order-2 n-gram variant proceeds similarly, except it starts by randomly selecting "The rhombus" or "The skeleton" and then, if "The rhombus" was selected, randomly selects a word which followed "The rhombus."
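
A minimal sketch of this procedure in Python (illustrative only, not the site's actual code; for brevity it starts from an arbitrary state rather than a sentence-opening word, and the sample text and helper names are invented for the example):

    import random
    from collections import defaultdict

    def build_chain(text, order=1):
        # Map each n-gram (a tuple of `order` words) to every word that
        # follows it in the input; keeping duplicates in the list is what
        # produces the frequency weighting described above.
        words = text.split()
        chain = defaultdict(list)
        for i in range(len(words) - order):
            chain[tuple(words[i:i + order])].append(words[i + order])
        return chain

    def generate(chain, length=30):
        # Hop from state to state, picking a random observed successor
        # each time; random.choice over the duplicate-laden list yields
        # the 75%/25% behavior of the "rhombus"/"skeleton" example.
        state = random.choice(list(chain.keys()))
        out = list(state)
        for _ in range(length):
            followers = chain.get(state)
            if not followers:  # dead end: this state never had a successor
                break
            out.append(random.choice(followers))
            state = tuple(out[-len(state):])
        return " ".join(out)

    text = ("The rhombus spins. The rhombus glows. "
            "The rhombus hums. The skeleton waits.")
    print(generate(build_chain(text, order=1)))  # order=2 gives the bigram variant

In this sample, "The" is followed by "rhombus" three times and "skeleton" once, so random.choice reproduces the 75%/25% split without any explicit probability table.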


Use in Literature


“Unfortunately human effort, which always varies the arrangement of existing elements, cannot be applied to producing a single new element. A landscape in which nothing terrestrial figures is beyond the scope of our imagination.”
— André Breton, "Max Ernst" (1921)

By running multiple texts together through a Markov chain at the same time, it becomes possible to collage bits of text in novel, often nonsensical ways that rebel against syntax and invert idiomatic constructions. (Software, after all, has no subconscious to veto awkward constructions before they rise to conscious thought.)
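
In code, running texts "together" can be as simple as concatenating the inputs before building the chain, so that shared words become crossover points between sources. A sketch reusing the hypothetical build_chain and generate helpers above, with novel_text and treatise_text standing in for any two input strings:

    # One chain over two sources: wherever the texts share an order-2
    # state, the random walk can jump from one source into the other.
    combined = build_chain(novel_text + " " + treatise_text, order=2)
    print(generate(combined, length=80))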

There are several precedents for this technique. The cut-up technique, pioneered by the Dadaists and further developed by William S. Burroughs and Brion Gysin, rearranged blocks of text cut from multiple pages to invent a new body of text. In the 1980s, programmers realized the parodic potential of Markov chain algorithms when applied to writing and released Dissociated Press, a feature of the Emacs text editor. Following in their footsteps, Jamie Zawinski released DadaDodo, a C program which generates text from an input file.


Markov Mutagen


“But the marvelous ability to reach out, without leaving the field of our experience, to two distinct realities and bring them together to create a spark... and, by removing our systems of reference, to disorient us within our own memories, that is what holds Dada's attention, for the time being.”
— André Breton, "Max Ernst" (1921)

With Markov Mutagen, you can easily combine inputs and run them through a Markov text generator or a simulation of the cut-up technique. You are encouraged to combine, edit, and collage the outputs, or even recycle an output as a new input. Cybernetic writing is an open-ended game whose procedures are still being developed and combined in new ways.
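
One plausible shape for the cut-up simulation, sketched under the same caveat (a guess at the idea, not the tool's actual code); the min_block and max_block parameters echo the "Block Size Between" input above:

    import random

    def cut_up(texts, min_block=3, max_block=8):
        # Chop each input into "slips" of min_block to max_block words,
        # shuffle all the slips from all the texts together, and paste
        # them back into a single stream, as Burroughs did with scissors.
        slips = []
        for text in texts:
            words = text.split()
            i = 0
            while i < len(words):
                size = random.randint(min_block, max_block)
                slips.append(" ".join(words[i:i + size]))
                i += size
        random.shuffle(slips)
        return " ".join(slips)

    print(cut_up(["Call me Ishmael. Some years ago, never mind how long,",
                  "It was a bright cold day in April, and the clocks"]))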

So try sampling text from a novel, news article, encyclopedia, conspiracy theory, theological treatise, or whatever else strikes your fancy. Try replacing a word (noun, verb, adjective) in one input with a word from another to manipulate the probabilities in the Markov model. The outputs of neural networks like TalkToTransformer also serve well as inputs. And enjoy the fruits of your discordant prosody.