My friend Matt Muir compiles a weekly newsletter of articles and links from across the internet (you really should subscribe). His prose style is distinct and delightful, and the relatively short gobbets of text readily lend themselves to algo-manipulation.
Plus, he has what can only be described as a mild case of logorrhea, meaning there is plenty of text to mine…
algocurios is powered by a modified version of OpenAI’s GPT-2 language model. Basically, it’s a text-generation system which has been trained by reading millions of articles online. In its standard form, it’s an amazing beast, capable of producing ‘plausible-sounding’ text in a myriad of forms.
If we take this existing model (which already has an excellent grasp of many forms of written English) and re-train it a little on Matt’s words, it readily produces text with the same kind of tone and vocabulary.
In the case of Matt and his webcurios, it got the gist very quickly and started spitting out new gobbets of text in the curios form almost immediately.
Given more training, the network would undoubtedly begin to memorise and repeat back sections of the source text. However, just teasing the network with a little new information results in a delightful blend of tone and subject matter.
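GPT-2 itself is far too big to sketch here, but the dynamic described above — a model trained on a small corpus can only recombine what it has seen, so light exposure borrows tone while heavier exposure slides toward verbatim repetition — shows up even in much simpler language models. As a toy stand-in (this is emphatically not how GPT-2 works internally, just an illustration of the same effect), here is a word-level bigram Markov chain trained on a tiny made-up corpus:

```python
import random
from collections import defaultdict

def train_bigrams(text):
    """Build a bigram table: each word maps to the list of words seen after it."""
    words = text.split()
    table = defaultdict(list)
    for a, b in zip(words, words[1:]):
        table[a].append(b)
    return table

def generate(table, start, length=15, seed=0):
    """Walk the table from a start word, picking a random observed successor each step."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        choices = table.get(out[-1])
        if not choices:  # dead end: no word ever followed this one
            break
        out.append(rng.choice(choices))
    return " ".join(out)

# Tiny invented corpus (an assumption for illustration, not Matt's actual text)
corpus = ("the internet is a strange and delightful place "
          "the internet is full of strange rabbit holes "
          "a delightful rabbit hole is a gift")

table = train_bigrams(corpus)
print(generate(table, "the"))
```

Because every generated word pair must already occur in the training text, a tiny corpus yields near-verbatim echoes of the source — the memorisation failure mode — while a larger, more varied corpus gives the model more junctions to recombine at, producing novel-sounding text in the source’s voice.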
The delight, for me, is the way it forces the reader to make unexpected metaphorical leaps in an attempt to extract meaning. Often, algorithmically generated text is either semi-predictable or too nonsensical; algocurios hits the sweet spot.
Visit algocurios and explore for yourself at webcurios.co.uk