“The true literature machine will be one that itself feels the need to produce disorder, as a reaction against its preceding production of order: a machine that will produce avant-garde work to free its circuits when they are choked by too long a production of classicism.”
— Italo Calvino, “Cybernetics and Ghosts”
Search for .epub files on Library
Add a Project Gutenberg or Archive.org URL.
A random Project Gutenberg document will be selected.
A Markov chain is a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event. In the case of text generation, these probabilities are determined by the word order of an input, which can be broken down into n-grams (units) of 1, 2, or 3 words. In an order 1 n-gram model, Markov text generation begins by selecting a random word which begins a sentence and then selects a random word which followed that word, weighted by frequency of appearance: for instance, if in the input "The" is followed by "rhombus" thrice and "skeleton" once, there is a 75% chance the algorithm will select rhombus next and a 25% chance skeleton will be selected. If skeleton is selected, the word which follows skeleton in the original text will appear next, since skeleton appeared only once, but if rhombus is selected, then a random word which follows rhombus in the original passage will appear. The order 2 n-gram variant would proceed similarly, except it would start by selecting "The rhombus" or "The skeleton" and then, if "The rhombus" was selected, randomly select a word which follows that pair in the input.
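The procedure above can be sketched in a few lines of Python. This is a minimal illustration, not Markov Mutagen's actual implementation, and the function names are invented for the example. Note that keeping duplicate followers in a plain list makes random.choice frequency-weighted automatically: if "The" is followed by "rhombus" three times and "skeleton" once, "rhombus" occupies three of the four slots.

```python
import random
from collections import defaultdict

def build_chain(text, order=1):
    # Map each n-gram (tuple of words) to the list of words that follow it.
    # Duplicates are kept, so random.choice is weighted by frequency.
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain[key].append(words[i + order])
    return chain

def generate(chain, length=30):
    # Start from a random n-gram, then repeatedly pick a weighted-random
    # follower and slide the window forward by one word.
    key = random.choice(list(chain.keys()))
    out = list(key)
    for _ in range(length):
        if key not in chain:
            break  # reached an n-gram that only occurs at the end of the input
        nxt = random.choice(chain[key])
        out.append(nxt)
        key = key[1:] + (nxt,)
    return " ".join(out)
```

Raising order to 2 or 3 makes the output cling more closely to the source text, since longer n-grams have fewer distinct followers.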
By running multiple texts together through a Markov chain at the same time, it becomes possible to collage bits of text in novel, often nonsensical ways that rebel against syntax and invert idiomatic constructions. (Software, after all, has no subconscious to veto awkward constructions before they arise to conscious thought.)
There are several precedents to this technique. The cut-up technique, pioneered by the Dadaists and further developed by William S. Burroughs and Brion Gysin, rearranged cut-up blocks of text from multiple pages to invent a new body of text. Programmers later realized the parodic potential of Markov chain algorithms when applied to writing and released Dissociated Press, a plug-in for the Emacs text editor. Following in their footsteps, Jamie Zawinski released the program DadaDodo, which generates text from an input file.
With Markov Mutagen, you can easily combine and run inputs through a Markov text generator or a simulation of the cut-up technique. You are encouraged to combine, edit, and collage the outputs, or even recycle the output as an input.
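A cut-up simulation can be sketched just as briefly. How Markov Mutagen actually slices its inputs is an assumption here; this version simply chops each text into fixed-size blocks of words and shuffles all the blocks together.

```python
import random

def cut_up(texts, block_size=4):
    # Slice each input text into blocks of block_size words,
    # shuffle every block from every text together, and rejoin them
    # into a single collaged passage.
    blocks = []
    for text in texts:
        words = text.split()
        for i in range(0, len(words), block_size):
            blocks.append(" ".join(words[i:i + block_size]))
    random.shuffle(blocks)
    return " ".join(blocks)
```

Unlike a Markov chain, every word of every input survives the shuffle; only the order of the blocks changes, which keeps short runs of the original syntax intact.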
Cybernetic writing is an open-ended game whose procedures are still being developed and combined in new ways. Try sampling text from a novel, news article, encyclopedia, conspiracy theory, theological treatise, or whatever strikes your fancy. Try replacing a word (noun, verb, adjective) in one input with a word from another to alter the probabilities in the Markov model. The outputs of neural networks like TalkToTransformer also serve well as inputs. Enjoy the fruits of your discordant prosody.