Registration Number: 2120U007892
Category: Preprint
Title: Enhancing controllability of text generation (AI translated)
Author: Shcherbyna Anton
Published: 2020-01-01
Source: Український католицький університет
URL: https://hdl.handle.net/20.500.14570/2240

Description: Many models can generate text conditioned on some context, but those approaches do not let us control various aspects of the generated text (e.g., sentiment). To address this problem, Variational Autoencoders (VAEs) are typically used, because they make it possible to manipulate the latent space and, in this way, control text generation. However, it has been shown that a VAE with a strong autoregressive decoder, as used for text modeling, faces the posterior collapse problem. We think that one of the reasons this problem occurs is the restrictive Gaussian assumption made about the approximate posterior. In this work, we want to apply well-known approaches based on Normalizing Flows to improve the approximate posterior for text modeling and check whether this can help avoid posterior collapse.
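The description above proposes Normalizing Flows as a way to move beyond the Gaussian approximate posterior. As a minimal illustrative sketch (not the work's actual implementation, which is not given here), a single planar flow step transforms a latent sample z into f(z) = z + u · tanh(w·z + b) while tracking the log-determinant of the Jacobian needed for the flow-based density; the parameter values below are arbitrary placeholders:

```python
import math

def planar_flow(z, u, w, b):
    """One planar flow step: f(z) = z + u * tanh(w.z + b).

    z, u, w are plain Python lists of equal length; b is a scalar.
    Returns the transformed sample and log|det Jacobian|, the term
    added to the log-density when composing flows.
    """
    a = sum(wi * zi for wi, zi in zip(w, z)) + b
    h = math.tanh(a)
    f = [zi + ui * h for zi, ui in zip(z, u)]
    # log|det J| = log|1 + (1 - tanh^2(a)) * (u . w)|
    psi_u = (1.0 - h * h) * sum(ui * wi for ui, wi in zip(u, w))
    logdet = math.log(abs(1.0 + psi_u))
    return f, logdet

# Example with placeholder parameters: a Gaussian sample z0 is pushed
# through one flow step, yielding a sample from a richer posterior.
z0 = [0.5, -1.0]
z1, logdet = planar_flow(z0, u=[0.3, 0.2], w=[1.0, -0.5], b=0.1)
```

In practice several such steps are composed, and the accumulated log-determinants correct the log-density of the final sample in the VAE's evidence lower bound.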
Shcherbyna Anton. Enhancing controllability of text generation (AI translated). Published 2020-01-01. Український католицький університет, 2120U007892.