Dialogue Success Using Multi-Task Learning Task-Oriented Dialogue Based on Bidirectional Auto-Regressive Transformers

Putriansyah, Tasya Fathia (2023) Dialogue Success Using Multi-Task Learning Task-Oriented Dialogue Based on Bidirectional Auto-Regressive Transformers. Other thesis, Institut Teknologi Sepuluh Nopember.

06111940000072-Undergraduate_Thesis.pdf - Accepted Version
Restricted to Repository staff only until 1 October 2025.


Abstract

In real-world applications, the success of a dialogue (conversation) is determined not by a single topic but by several topics under discussion. For example, a holiday-planning dialogue involves several tasks such as buying transport tickets, booking a hotel, and so on. Multi-task handling is therefore a key factor in chatbot systems. One model developed for multi-task dialogue is multi-task learning task-oriented dialogue (MTTOD). The MTTOD model is based on the pre-trained T5 architecture, which still has the drawback of not considering the context of a sentence. This final project therefore proposes an MTTOD model that uses the pre-trained Bidirectional Auto-Regressive Transformers (BART) architecture to capture the context of a conversation. The aim of this final project is to improve dialogue success performance. The proposed model is evaluated on the public Multi-domain Wizard-of-Oz (MultiWOZ) dataset. The evaluation results show that the MTTOD-BART model still scores lower than the baseline model on each evaluation metric, because it is less able to capture the conversational context of a dialogue.
====================================================================================================================================
Dialogue success depends not only on a single domain but also on multiple domains in real-world applications. For instance, in a holiday-planning dialogue, the user asks about several distinct tasks such as ticket fares, hotel booking, and so on. Hence, multi-task learning is an important approach for chatbot systems. One multi-task learning model for chatbot systems is multi-task learning task-oriented dialogue (MTTOD). The MTTOD model is built on a pre-trained T5 model, which has the drawback of neglecting sentence context. In this study, we propose an MTTOD model that uses a pre-trained Bidirectional Auto-Regressive Transformers (BART) backbone to capture the context of utterances. We aim to improve dialogue success. The proposed model is evaluated on the public Multi-domain Wizard-of-Oz (MultiWOZ) dataset. The experimental results show that MTTOD-BART yields lower scores than the baseline model on each metric, since it cannot capture the dialogue context well.
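As an illustration of the approach described in the abstract, the following is a minimal Python sketch (not the thesis code) of the backbone swap: loading a pre-trained BART encoder-decoder with the Hugging Face transformers library and generating text from a linearized dialogue history. The checkpoint name facebook/bart-base and the input format are assumptions made for illustration; in MTTOD-style training the model would additionally be fine-tuned on MultiWOZ to emit belief states and delexicalized system responses.

# Minimal sketch, assuming the Hugging Face transformers library and the
# facebook/bart-base checkpoint; this is not the thesis implementation.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

MODEL_NAME = "facebook/bart-base"  # assumed checkpoint; the thesis may use another size

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

# Hypothetical MultiWOZ-style dialogue history, linearized as plain text.
history = "<user> I need a cheap hotel in the north and a train to Cambridge on Friday."

inputs = tokenizer(history, return_tensors="pt", truncation=True, max_length=512)

# In MTTOD-style training the decoder is fine-tuned to emit the belief state and
# the delexicalized system response; here we only show raw generation from the
# pre-trained (not fine-tuned) BART encoder-decoder.
output_ids = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))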

Item Type: Thesis (Other)
Uncontrolled Keywords: Chatbot, Dialogue Success, Task-Oriented Dialogue, BART
Subjects: Q Science > QA Mathematics > QA336 Artificial Intelligence
T Technology > TK Electrical engineering. Electronics Nuclear engineering > TK6565.T7 Transformers
Divisions: Faculty of Science and Data Analytics (SCIENTICS) > Mathematics > 44201-(S1) Undergraduate Thesis
Depositing User: Tasya Fathia Putriansyah
Date Deposited: 05 Aug 2023 03:41
Last Modified: 05 Aug 2023 03:41
URI: http://repository.its.ac.id/id/eprint/103728
