KDD 2023 Tutorial:
Fast Text Generation with Text-Editing Models

Google, University of California, Riverside

Tuesday August 8, 10 am - 1 pm (PDT) @ 202C - Long Beach Convention & Entertainment Center

About this tutorial

Text-editing models have recently become a prominent alternative to seq2seq models for monolingual text-generation tasks such as grammatical error correction, text simplification, and style transfer. These tasks share a common trait – they exhibit a large amount of textual overlap between the source and target texts.
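
For example, in grammatical error correction most source tokens carry over to the target unchanged. The short Python sketch below (an illustrative example with made-up sentences, not part of the tutorial materials) uses the standard difflib module to make this overlap concrete:

import difflib

# Illustrative only: a grammatical error correction pair with made-up sentences.
source = "He have been working here since five years .".split()
target = "He has been working here for five years .".split()

matcher = difflib.SequenceMatcher(None, source, target)
print(f"Token overlap ratio: {matcher.ratio():.2f}")  # 0.78: most tokens match
for op, i1, i2, j1, j2 in matcher.get_opcodes():
    print(op, source[i1:i2], "->", target[j1:j2])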

Text-editing models take advantage of this observation and learn to generate the output by predicting edit operations applied to the source sequence. In contrast, seq2seq models generate outputs word by word from scratch, which makes them slow at inference time. Text-editing models provide several benefits over seq2seq models, including faster inference, higher sample efficiency, and better control and interpretability of the outputs.
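
As a rough illustration of this idea, the sketch below applies per-token edit tags (KEEP, DELETE, optionally combined with a phrase to append) to reconstruct a target from the source, in the spirit of tagging-based models such as LaserTagger. The tag format and the sentence-fusion example are simplified assumptions for illustration, not the exact scheme of any particular model:

def realize(tokens, tags):
    """Apply one edit tag per source token to produce the output tokens."""
    output = []
    for token, tag in zip(tokens, tags):
        op, _, phrase = tag.partition("|")  # e.g. "KEEP|and" keeps the token, then appends "and"
        if op == "KEEP":
            output.append(token)
        # op == "DELETE": the source token is dropped.
        if phrase:
            output.append(phrase)
    return output

# Sentence fusion: the two sentences are merged by deleting the first period
# and the repeated subject, and appending "and".
source = "Turing was born in 1912 . Turing died in 1954 .".split()
tags = ["KEEP", "KEEP", "KEEP", "KEEP", "KEEP|and",
        "DELETE", "DELETE", "KEEP", "KEEP", "KEEP", "KEEP"]
print(" ".join(realize(source, tags)))
# Turing was born in 1912 and died in 1954 .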

This tutorial provides a comprehensive overview of text-editing models and current state-of-the-art approaches, analyzing their pros and cons. We discuss challenges related to deployment and how these models help to mitigate hallucination and bias, both pressing challenges in the field of text generation.

Schedule

Our tutorial will be held on August 8, 10 am - 1 pm (PDT). The slides may be updated.

Time Section Presenter
10:00—10:15 Section 1: Introduction - What are text-editing models? [Slides] Eric
10:15—10:50 Section 2: Model Design [Slides] Eric, Jonathan
10:50—11:25 Section 3: Applications [Slides] Yue
11:25—11:30 Break
11:30—11:45 Section 4: Controllable Generation [Slides] Yue
11:45—11:55 Section 5: Multilingual Text Editing [Slides] Eric
11:55—12:25 Section 6: Faster (Large) Language Models [Slides] Jonathan
12:25—12:30 Section 7: Recommendations & Future Directions [Slides] Eric
12:30—13:00 Q & A Session

Reading List

Prerequisites


Text-Editing Methods Discussed during the Tutorial

  • EdiT5, Mallinson et al., 2022.
  • EditNTS, Dong et al., 2019.
  • Felix, Mallinson et al., 2020.
  • GECToR, Omelianchuk et al., 2020.
  • HCT, Jin et al., 2022.
  • LaserTagger, Malmi et al., 2019.
  • LevT, Gu et al., 2019.
  • LEWIS, Reid and Zhong, 2021.
  • Masker, Malmi et al., 2020.
  • PIE, Awasthi et al., 2019.
  • RUN, Liu et al., 2020.
  • Seq2Edits, Stahlberg and Kumar, 2020.
  • SL, Alva-Manchego et al., 2017.

Acknowledgements

    We would like to thank Cesar Ilharco.

BibTeX

@inproceedings{kdd2023-text-editing-tutorial,
author = {Malmi, Eric and Dong, Yue and Mallinson, Jonathan and Chuklin, Aleksandr and Adamek, Jakub and Mirylenka, Daniil and Stahlberg, Felix and Krause, Sebastian and Kumar, Shankar and Severyn, Aliaksei},
title = {Fast Text Generation with Text-Editing Models},
year = {2023},
isbn = {9798400701030},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3580305.3599579},
doi = {10.1145/3580305.3599579},
booktitle = {Proceedings of the 29th ACM SIGKDD Conference on Knowledge Discovery and Data Mining},
pages = {5815–5816},
location = {Long Beach, CA, USA},
series = {KDD '23}
}