Turkish Data-to-Text Generation Using Sequence-to-Sequence Neural Networks

Date

2023

Authors

Demir, Şeniz

Journal Title

Journal ISSN

Volume Title

Publisher

Assoc Computing Machinery

Abstract

End-to-end data-driven approaches have led to the rapid development of language generation and dialogue systems. Despite the need for large amounts of well-organized data, these approaches jointly learn multiple components of the traditional generation pipeline without requiring costly human intervention. End-to-end approaches also enable the use of loosely aligned parallel datasets in system development by relaxing the degree of semantic correspondence between training data representations and text spans. However, their potential in Turkish language generation has not yet been fully exploited. In this work, we apply sequence-to-sequence (Seq2Seq) neural models to Turkish data-to-text generation, where the input data given in the form of a meaning representation is verbalized. We explore encoder-decoder architectures with an attention mechanism in unidirectional, bidirectional, and stacked recurrent neural network (RNN) models. Our models generate one-sentence biographies and dining venue descriptions using a crowdsourced dataset where all field-value pairs that appear in meaning representations are fully captured in reference sentences. To support this work, we also explore the performance of our models on a more challenging dataset, where the content of a meaning representation is too large to fit into a single sentence, and hence content selection and surface realization need to be learned jointly. This dataset is retrieved by coupling introductory sentences of person-related Turkish Wikipedia articles with their contained infobox tables. Our empirical experiments on both datasets demonstrate that Seq2Seq models are capable of generating coherent and fluent biographies and venue descriptions from field-value pairs. We argue that the wealth of knowledge residing in our datasets and the insights obtained from this study hold the potential to give rise to the development of new end-to-end generation approaches for Turkish and other morphologically rich languages.
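The abstract describes verbalizing a meaning representation given as field-value pairs. Before such input can be fed to a Seq2Seq encoder, it is typically linearized into a token sequence. The following is a minimal sketch of one common linearization scheme (field-delimiter tokens around each value); the delimiter format and the example fields are illustrative assumptions, not taken from the paper's datasets.

```python
def linearize_mr(mr):
    """Flatten a meaning representation (a list of field-value pairs)
    into a flat token sequence for a Seq2Seq encoder.

    Each value is wrapped in hypothetical <field> ... </field> marker
    tokens so the decoder can learn which tokens belong to which field.
    """
    tokens = []
    for field, value in mr:
        tokens.append(f"<{field}>")       # opening field marker
        tokens.extend(str(value).split())  # value tokens, whitespace-split
        tokens.append(f"</{field}>")      # closing field marker
    return tokens

# Hypothetical biography-style meaning representation:
mr = [("name", "Ali Veli"), ("occupation", "yazar"), ("birth_year", "1950")]
print(linearize_mr(mr))
# → ['<name>', 'Ali', 'Veli', '</name>', '<occupation>', 'yazar',
#    '</occupation>', '<birth_year>', '1950', '</birth_year>']
```

The resulting token sequence is what an encoder-decoder model with attention would consume on the source side, with the reference sentence as the target side.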

Description

This work is supported by TUBITAK-ARDEB under the grant number 117E977.

Keywords

State-of-the-art, Sequence-to-sequence model, Turkish, Wikipedia, Natural-language generation, Data-to-text generation

Citation

Demir, S. (2023). Turkish Data-to-Text Generation Using Sequence-to-Sequence Neural Networks. ACM Transactions on Asian and Low-Resource Language Information Processing, 22(2), 1-27.

WoS Q

Q4

Scopus Q

Q2

Volume

22

Issue

2

Start Page

1

End Page

27