A Lite Hierarchical Model for Dialogue Summarization with Multi-Granularity Decoder
Location
Online
Event Website
https://hicss.hawaii.edu/
Start Date
January 3, 2023
End Date
January 7, 2023
Description
Abstractive dialogue summarization has recently attracted considerable research attention, especially the use of hierarchical models to accomplish abstractive summarization tasks. However, recent studies often suffer from an excessive number of model parameters and long training times, mainly because existing hierarchical models generate dialogue summaries by adding extra encoders and attention layers to the decoder to strengthen the model's learning and summary-generation ability. Hence, a more lightweight hierarchical model is needed. This study proposes a lightweight hierarchical model named ALH-BART that generates high-accuracy dialogue summaries rapidly. The proposed model includes word and turn encoders, which enhance its ability to understand dialogue, and a multi-granularity decoder that decodes word- and turn-level information simultaneously. The parameters of each encoder's multi-head self-attention are shared with the corresponding multi-head self-attention in the decoder, which reduces the number of model parameters and effectively speeds up training. Finally, the effectiveness of the model is verified on the SAMSum and DialogSum datasets.
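To make the abstract's two architectural ideas concrete, the sketch below shows a decoder layer that cross-attends to both word-level and turn-level encoder outputs, and that reuses the encoder's multi-head self-attention parameters instead of learning its own. This is a minimal, hypothetical PyTorch sketch, not the authors' released implementation; all names (MultiGranularityDecoderLayer, word_memory, turn_memory, and so on) are assumptions for illustration.

import torch
import torch.nn as nn

class MultiGranularityDecoderLayer(nn.Module):
    """Hypothetical sketch of one decoder layer that attends to both
    word-level and turn-level encoder memories and shares the encoder's
    self-attention parameters (assumed names, not the paper's code)."""

    def __init__(self, shared_self_attn: nn.MultiheadAttention,
                 d_model: int = 512, n_heads: int = 8):
        super().__init__()
        # Parameter sharing: the decoder's self-attention is the very
        # module the encoder already owns, so no new weights are added.
        self.self_attn = shared_self_attn
        # Two cross-attention blocks, one per granularity.
        self.word_cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.turn_cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ffn = nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.ReLU(),
                                 nn.Linear(4 * d_model, d_model))
        self.norms = nn.ModuleList([nn.LayerNorm(d_model) for _ in range(4)])

    def forward(self, tgt, word_memory, turn_memory):
        # Self-attention over the summary prefix (causal mask omitted here).
        x, _ = self.self_attn(tgt, tgt, tgt, need_weights=False)
        tgt = self.norms[0](tgt + x)
        # Word-level cross-attention: fine-grained token information.
        x, _ = self.word_cross_attn(tgt, word_memory, word_memory, need_weights=False)
        tgt = self.norms[1](tgt + x)
        # Turn-level cross-attention: coarse-grained utterance information.
        x, _ = self.turn_cross_attn(tgt, turn_memory, turn_memory, need_weights=False)
        tgt = self.norms[2](tgt + x)
        return self.norms[3](tgt + self.ffn(tgt))

# Usage: the encoder's self-attention module is passed in, so its
# parameters are counted once but used in both encoder and decoder.
d_model, n_heads = 512, 8
encoder_self_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
layer = MultiGranularityDecoderLayer(encoder_self_attn, d_model, n_heads)
summary_tokens = torch.randn(2, 20, d_model)   # partial summary states
word_memory = torch.randn(2, 120, d_model)     # word-encoder outputs
turn_memory = torch.randn(2, 12, d_model)      # turn-encoder outputs
out = layer(summary_tokens, word_memory, turn_memory)
print(out.shape)  # torch.Size([2, 20, 512])

Under these assumptions, only the two cross-attention blocks and the feed-forward network add new parameters per decoder layer; the shared self-attention is where the claimed reduction in parameter count and training time would come from.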
Recommended Citation
Zheng, Tong and Saga, Ryosuke, "A Lite Hierarchical Model for Dialogue Summarization with Multi-Granularity Decoder" (2023). Hawaii International Conference on System Sciences 2023 (HICSS-56). 9.
https://aisel.aisnet.org/hicss-56/dsm/data_analytics/9