allison

new work from me in the ISEA 2020 Art Programme: Reconstructions reconstructions.decontextualiz, an infinite computer-generated poem with a nested chiastic structure

the lines are generated by sampling a variational autoencoder (github.com/aparrish/vae-laggin) trained on a bunch of public domain poetry (github.com/aparrish/gutenberg-). each line is paired with the model's reconstruction of that same line with its words reversed
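
(a rough sketch of the pairing + nesting idea, in python; `sample_line_pair` below is a made-up stand-in for the actual VAE sampling/reconstruction step, not the vae-lagging-encoder API)

```python
import random

def sample_line_pair(rng):
    # placeholder: in the real piece this step samples a latent vector from
    # the trained VAE, decodes it into a line, then has the model reconstruct
    # that line with its words reversed; here it's faked with a tiny word list
    words = rng.sample(["dusk", "river", "stone", "wing", "ember", "salt"], 3)
    return " ".join(words), " ".join(reversed(words))

def chiastic_poem(depth, rng):
    # nest line/reconstruction pairs chiastically: A B ... B' A'
    if depth == 0:
        return []
    line, mirror = sample_line_pair(rng)
    return [line] + chiastic_poem(depth - 1, rng) + [mirror]

print("\n".join(chiastic_poem(4, random.Random(0))))
```

(the recursion is what gives the nesting: each pair wraps around everything generated after it, and since you can always recurse one level deeper, the poem can go on indefinitely)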

allison

here's a recording of a big ol' lecture I wrote & delivered for BMOLab and Vector Institute last month, entitled "Language Models and Poetics," in which I claim that computers not only *can* generate poetry, but in fact they can *only* generate poetry.

bmolab.artsci.utoronto.ca/?p=8

the discussion touches on GPT-n (of course), speech act theory, William Carlos Williams' _Spring and All_, Frank Lantz (twice), and more 🎶

(the audio didn't come out great, happy to supply text/slides to interested folks)

allison

oh hey a deep learning constituency parser! I thought everyone had just moved to dependency parsing forever: demo.allennlp.org/constituency

allison

(I'm interested in constituency parsing for poetic purposes bc I think it maps more neatly than dependency parsing to people's internal mental models of how syntax works)
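
(for a concrete sense of what I mean, here's a toy sketch with nltk and a tiny hand-written grammar, totally separate from the allennlp demo above; the grammar and sentence are just made up for illustration)

```python
import nltk

# tiny hand-written grammar, purely illustrative
grammar = nltk.CFG.fromstring("""
  S  -> NP VP
  NP -> Det N | Det N PP
  VP -> V NP | V NP PP
  PP -> P NP
  Det -> 'the' | 'a'
  N  -> 'poet' | 'poem' | 'machine'
  V  -> 'writes'
  P  -> 'with'
""")

parser = nltk.ChartParser(grammar)
sentence = "the poet writes a poem with a machine".split()
for tree in parser.parse(sentence):
    tree.pretty_print()  # nested phrases (NP, VP, PP) rather than flat head-dependent arcs
```

(it prints two trees, depending on whether "with a machine" attaches to the verb phrase or to "a poem", and that kind of nested phrase grouping is exactly the structure I think people carry around in their heads, which dependency arcs tend to flatten out)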

allison

magnifique, chef's kiss, a single tear runs down my cheek
