The tiny poet: artificial poetry generation with a constrained GPT-2 model

Sergiu Stoia, L. Alfonso Ureña-López, Arturo Montejo-Ráez

Abstract

This paper presents a constrained language model based on GPT-2, trained for poetry generation in Spanish. Our proposal applies constraints to the generated sequences so that they satisfy rhyme and meter, by means of a backtracking procedure during text generation. For evaluation, a Turing test was carried out on a sample of the lay population, and experts rated several factors on a Likert scale. Despite the relative simplicity of GPT-2 compared to current models, the results obtained highlight the value of constraint-based generation systems over models with many more parameters that are far more expensive to train.
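To illustrate the general idea of generation with backtracking, the following is a minimal sketch, not the authors' implementation. It assumes the Hugging Face transformers library and the stock "gpt2" checkpoint (the paper uses a Spanish model), and the constraint check line_ok is a hypothetical placeholder where a real system would verify rhyme and syllable count.

import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")  # the paper uses a Spanish GPT-2
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def line_ok(line: str) -> bool:
    # Hypothetical constraint: a real system would check meter (syllable
    # count) and whether the line ending rhymes with a target line.
    return 4 <= len(line.split()) <= 12

def generate_line(prompt: str, max_new_tokens: int = 25, max_retries: int = 10) -> str:
    base = tokenizer.encode(prompt, return_tensors="pt")
    line = ""
    for _ in range(max_retries):
        ids = base
        for _ in range(max_new_tokens):
            with torch.no_grad():
                logits = model(ids).logits[:, -1, :]
            probs = torch.softmax(logits / 0.9, dim=-1)   # temperature sampling
            next_id = torch.multinomial(probs, num_samples=1)
            ids = torch.cat([ids, next_id], dim=-1)
            if tokenizer.decode(next_id[0]).endswith("\n"):
                break  # end of verse line
        line = tokenizer.decode(ids[0, base.shape[1]:]).strip()
        if line_ok(line):
            return line  # constraints satisfied
        # Backtrack: discard the candidate line and resample from the prompt.
    return line  # best effort after max_retries

print(generate_line("El mar y la luna"))

The design point is that constraints are enforced outside the model: candidate continuations are sampled freely, checked against the formal requirements, and rejected (backtracked over) until a satisfying verse is found.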
