Paper
23 May 2023
The transfer method on the prompt for the summarization task
Xinhao Guo, Xiao Da
Proceedings Volume 12604, International Conference on Computer Graphics, Artificial Intelligence, and Data Processing (ICCAID 2022); 126040L (2023) https://doi.org/10.1117/12.2674622
Event: 2nd International Conference on Computer Graphics, Artificial Intelligence, and Data Processing (ICCAID 2022), 2022, Guangzhou, China
Abstract
Prompt tuning is a parameter-efficient method that can even surpass traditional fine-tuning in few-shot scenarios. Pre-trained language models continue to grow in parameter count, which makes traditional fine-tuning impractical to implement and costly in computing resources; prompt-based methods therefore have broad application prospects. In our experiments, we found that prefix tuning, a prompt-based method, converges slowly or fails to converge when training samples are few. This paper proposes a cross-task parameter transfer method that transfers trained parameters from a prompt tuning task to prefix tuning, improving training speed and alleviating the non-convergence or slow convergence of prefix tuning.
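The transfer step the abstract describes at a high level can be sketched as follows. This is a minimal illustration, not the paper's implementation: all names, dimensions, and the exact transfer rule (copying the trained soft prompt into each layer's prefix pair) are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical model dimensions (not taken from the paper).
prompt_len, d_model, n_layers = 20, 768, 12

# Parameters learned by prompt tuning: a single soft-prompt
# embedding matrix prepended to the input embeddings.
trained_soft_prompt = rng.normal(size=(prompt_len, d_model))

def init_prefix_from_prompt(soft_prompt, n_layers):
    """Initialize prefix-tuning parameters from a trained soft prompt.

    Prefix tuning keeps, for every transformer layer, a pair of
    (key, value) prefix matrices. Instead of random initialization,
    each pair is seeded here with the prompt-tuning embeddings --
    one plausible form of the cross-task parameter transfer the
    abstract proposes to speed up convergence.
    """
    prefix = {}
    for layer in range(n_layers):
        prefix[layer] = {
            "key": soft_prompt.copy(),
            "value": soft_prompt.copy(),
        }
    return prefix

prefix_params = init_prefix_from_prompt(trained_soft_prompt, n_layers)
print(len(prefix_params), prefix_params[0]["key"].shape)
```

The transferred matrices would then be fine-tuned as usual by prefix tuning; only the initialization changes, which is what targets the slow-convergence problem described above.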
© (2023) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Xinhao Guo and Xiao Da "The transfer method on the prompt for the summarization task", Proc. SPIE 12604, International Conference on Computer Graphics, Artificial Intelligence, and Data Processing (ICCAID 2022), 126040L (23 May 2023); https://doi.org/10.1117/12.2674622
KEYWORDS
Artificial intelligence
Deep learning