Replacing Language Model for Style Transfer

11/14/2022
by Pengyu Cheng, et al.

We introduce the replacing language model (RLM), a sequence-to-sequence language modeling framework for text style transfer. Our method autoregressively replaces each token of the source sentence with a text span in the target style; each new span is generated via a non-autoregressive masked language model. This generation scheme combines the flexibility of autoregressive models with the accuracy of non-autoregressive models, bridging the gap between sentence-level and word-level style transfer methods. To further control the style of the generated sentences, we conduct style-content disentanglement on the hidden representations of RLM. Empirical results on real-world text style transfer tasks demonstrate the effectiveness of RLM compared with other baselines.
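To make the generation scheme concrete, here is a minimal PyTorch sketch of the decoding structure the abstract describes: an autoregressive outer loop over source tokens, where each replacement span is filled in by a single parallel (non-autoregressive) masked-LM step. The module names, the span-length predictor, and the style-injection mechanism are illustrative assumptions, not the paper's actual architecture.

```python
# Illustrative sketch of an RLM-style decoding loop. All components here
# (TinyEncoder, SpanGenerator, length_head, style_vec) are toy stand-ins
# assumed for demonstration; the paper's real model will differ.
import torch
import torch.nn as nn

VOCAB, HIDDEN, MAX_SPAN = 1000, 64, 4

class TinyEncoder(nn.Module):
    """Toy contextual encoder for the source sentence."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, HIDDEN)
        self.rnn = nn.GRU(HIDDEN, HIDDEN, batch_first=True)

    def forward(self, ids):
        out, _ = self.rnn(self.embed(ids))
        return out                               # (batch, seq, hidden)

class SpanGenerator(nn.Module):
    """Non-autoregressive masked-LM head: fills a whole span in parallel."""
    def __init__(self):
        super().__init__()
        self.pos = nn.Embedding(MAX_SPAN, HIDDEN)
        self.out = nn.Linear(HIDDEN, VOCAB)

    def forward(self, state, span_len):
        # Every masked slot gets the shared context state plus its own
        # position embedding, then an independent vocabulary prediction.
        pos = self.pos(torch.arange(span_len))
        h = state.unsqueeze(1) + pos             # (batch, span_len, hidden)
        return self.out(h).argmax(-1)            # (batch, span_len)

@torch.no_grad()
def rlm_transfer(src_ids, encoder, length_head, span_gen, style_vec):
    """Autoregressive outer loop over source tokens; each token is replaced
    by a span decoded non-autoregressively, mirroring the abstract."""
    states = encoder(src_ids) + style_vec        # inject target-style signal
    output = []
    for t in range(src_ids.size(1)):             # left-to-right replacement
        state_t = states[:, t]                   # context around token t
        span_len = int(length_head(state_t).argmax(-1).item()) + 1
        output.append(span_gen(state_t, span_len))
    return torch.cat(output, dim=1)              # transferred token ids

encoder, span_gen = TinyEncoder(), SpanGenerator()
length_head = nn.Linear(HIDDEN, MAX_SPAN)        # predicts replacement length
style_vec = torch.randn(HIDDEN)                  # stand-in target-style code
src = torch.randint(0, VOCAB, (1, 6))            # a 6-token "sentence"
print(rlm_transfer(src, encoder, length_head, span_gen, style_vec).shape)
```

The split of roles is the point of the sketch: the outer loop keeps the autoregressive model's flexibility over where and how much to rewrite, while each span itself is produced in one parallel step, the accuracy-oriented non-autoregressive side of the trade-off.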
