Adaptive Cyclic Learning Rate for Long-Term Time Series Forecasting in Deep Learning Models with Encoder-Decoder and Encoder-Only Architectures

Mahhisa, Farrela Ranku (2025) Adaptive Cyclic Learning Rate for Long-Term Time Series Forecasting in Deep Learning Models with Encoder-Decoder and Encoder-Only Architectures. Other thesis, Institut Teknologi Sepuluh Nopember.

Full text: 5025211129-Undergraduate_Thesis.pdf (10MB), restricted to Repository staff only.

Abstract

Long-term time series forecasting presents a significant challenge across various domains, especially when the data exhibit complex patterns and extended prediction horizons. In deep learning, encoder-decoder and encoder-only architectures are commonly used for this task, although both still face limitations in training stability and efficiency. This study proposes Adaptive Cyclic Learning Rate (ACLR), an algorithm that dynamically and cyclically adjusts the learning rate based on training conditions, to improve the performance and convergence of deep learning models on long-term forecasting tasks. ACLR was applied to five representative models: Transformer and Informer (encoder-decoder), as well as iTransformer, FRNet, and PatchTST (encoder-only). The evaluation was conducted using three public datasets and four prediction-length scenarios. Results show that ACLR reduced MSE by up to 5.18% and MAE by up to 4.42% on the encoder-only models, whereas on the encoder-decoder models it increased MSE by up to 61.56% and MAE by up to 28.75%. These findings are supported by paired t-tests, which yielded statistically significant p-values in most test scenarios. The study demonstrates that the effectiveness of ACLR depends on the model architecture and that this adaptive approach is better suited to encoder-only models for long-term time series forecasting.
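
The full thesis is access-restricted, so the exact ACLR update rule is not reproduced in this record. As an illustration of the general idea described in the abstract (a learning rate that cycles between a lower and an upper bound, with the cycle adapted to training conditions), the following is a minimal Python sketch. The class name, the triangular cycle shape, and the stagnation-based amplitude decay are assumptions made for illustration, not the author's implementation.

    # Hypothetical sketch of an adaptive cyclic learning rate scheduler.
    # The cycle follows a triangular policy; the "adaptive" part shrinks the
    # cycle amplitude whenever a monitored metric (e.g. validation loss) stops
    # improving. Illustrative only; not the ACLR algorithm from the thesis.
    import math

    class AdaptiveCyclicLR:
        def __init__(self, optimizer, base_lr=1e-5, max_lr=1e-3,
                     step_size=200, shrink_factor=0.5, patience=2):
            self.optimizer = optimizer          # any object exposing param_groups
            self.base_lr = base_lr              # lower bound of the cycle
            self.max_lr = max_lr                # upper bound of the cycle
            self.step_size = step_size          # iterations per half-cycle
            self.shrink_factor = shrink_factor  # amplitude decay on stagnation
            self.patience = patience            # bad epochs tolerated before shrinking
            self.best_metric = math.inf
            self.bad_epochs = 0
            self.iteration = 0

        def batch_step(self):
            """Call after every optimizer step: set the LR from the triangular cycle."""
            cycle = math.floor(1 + self.iteration / (2 * self.step_size))
            x = abs(self.iteration / self.step_size - 2 * cycle + 1)
            lr = self.base_lr + (self.max_lr - self.base_lr) * max(0.0, 1 - x)
            for group in self.optimizer.param_groups:
                group["lr"] = lr
            self.iteration += 1
            return lr

        def epoch_step(self, metric):
            """Call once per epoch with a validation metric (lower is better):
            shrink the cycle amplitude when training stagnates."""
            if metric < self.best_metric:
                self.best_metric = metric
                self.bad_epochs = 0
            else:
                self.bad_epochs += 1
                if self.bad_epochs >= self.patience:
                    self.max_lr = self.base_lr + (self.max_lr - self.base_lr) * self.shrink_factor
                    self.bad_epochs = 0

In a typical training loop, batch_step() would be called after each parameter update and epoch_step(val_loss) after each validation pass; the thesis may adapt the learning rate on a different signal or with a different cycle shape. The paired t-test mentioned in the abstract could likewise be carried out with scipy.stats.ttest_rel; the error values below are placeholders, not results from the thesis.

    # Hypothetical paired t-test comparing baseline vs. ACLR errors per scenario.
    from scipy.stats import ttest_rel

    baseline_mse = [0.412, 0.376, 0.455, 0.391]  # placeholder values, one per prediction length
    aclr_mse = [0.398, 0.360, 0.441, 0.385]      # placeholder values

    t_stat, p_value = ttest_rel(baseline_mse, aclr_mse)
    print(f"t = {t_stat:.3f}, p = {p_value:.4f}")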

Item Type: Thesis (Other)
Uncontrolled Keywords: Adaptive learning rate, deep learning, long-term forecasting, model evaluation, time series.
Subjects: Q Science > QA Mathematics > QA76.87 Neural networks (Computer Science)
Divisions: Faculty of Intelligent Electrical and Informatics Technology (ELECTICS) > Informatics Engineering > 55201-(S1) Undergraduate Thesis
Depositing User: Farrela Ranku Mahhisa
Date Deposited: 04 Jun 2025 07:19
Last Modified: 04 Jun 2025 07:19
URI: http://repository.its.ac.id/id/eprint/119135
