
XLNet-Transformer optimization method for Chinese-Malay low-resource neural machine translation based on deep coded attention

Zhan Siqi (a)
Xu Zhizhan (a)
Yang Wei (b)
Xie Qianglai (b)
a. College of Information Engineering, b. Big Data Laboratory of Collaborative Innovation Center, Jiangxi University of Technology, Nanchang 330098, China

Abstract

Neural machine translation (NMT) has achieved remarkable results in many application fields and has fully demonstrated its superiority on large-scale corpora. However, there is still considerable room for improvement when corpus resources are insufficient. The lack of a Chinese-Malay parallel corpus directly limits the quality of Chinese-Malay machine translation. To address the unsatisfactory performance of Chinese-Malay low-resource machine translation, this paper proposed a low-resource neural machine translation method based on deep encoded attention and progressive unfreezing. Firstly, the method reconstructed the encoder using the XLNet pre-trained model and replaced the output of the traditional encoding layer with an XLNet dynamic aggregation module, effectively compensating for the bottleneck caused by the lack of Chinese-Malay corpus resources. Secondly, it improved the traditional encoder-decoder attention with a parallel cross-attention module in the decoder, which enhanced the ability to capture the latent relationships between source and target words. Finally, it adopted a progressive unfreezing training strategy to release the model's performance to the greatest extent. The experimental results demonstrate that the proposed method significantly improves performance on a small-scale Chinese-Malay dataset, confirming its effectiveness. Compared with other low-resource NMT methods, this method has a simpler structure and improves both the encoder and the decoder, resulting in a more significant enhancement of translation quality. The approach provides effective strategies and insights for low-resource machine translation.
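The abstract names three concrete mechanisms: dynamic aggregation of the XLNet encoder's layer outputs, a parallel cross-attention module in the decoder, and progressive unfreezing during training. The sketch below illustrates one plausible shape of each in PyTorch; it is not the authors' code, and the class names, the gated fusion, and the unfreezing schedule are assumptions made for illustration only.

```python
# Minimal sketch (not the authors' released code) of the three ideas the
# abstract names, assuming PyTorch. The gating fusion and the unfreezing
# schedule are illustrative assumptions, not the paper's exact design.
import torch
import torch.nn as nn


class DynamicAggregation(nn.Module):
    """Replace 'take the last encoder layer' with a learnable
    softmax-weighted sum over every XLNet layer's hidden states."""

    def __init__(self, num_layers: int):
        super().__init__()
        self.layer_logits = nn.Parameter(torch.zeros(num_layers))

    def forward(self, hidden_states):
        # hidden_states: list/tuple of num_layers tensors, each (B, T, d)
        weights = torch.softmax(self.layer_logits, dim=0)        # (L,)
        stacked = torch.stack(tuple(hidden_states), dim=0)       # (L, B, T, d)
        return (weights.view(-1, 1, 1, 1) * stacked).sum(dim=0)  # (B, T, d)


class ParallelCrossAttention(nn.Module):
    """Two encoder-decoder attention branches computed in parallel over the
    same encoder memory, fused by a learned gate (one plausible reading of
    the paper's 'parallel cross-attention')."""

    def __init__(self, d_model: int, nhead: int):
        super().__init__()
        self.attn_a = nn.MultiheadAttention(d_model, nhead, batch_first=True)
        self.attn_b = nn.MultiheadAttention(d_model, nhead, batch_first=True)
        self.fuse = nn.Linear(2 * d_model, d_model)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, tgt, memory):
        a, _ = self.attn_a(tgt, memory, memory)  # branch 1
        b, _ = self.attn_b(tgt, memory, memory)  # branch 2
        return self.norm(tgt + self.fuse(torch.cat((a, b), dim=-1)))


def progressive_unfreeze(encoder_layers, epoch: int, epochs_per_stage: int = 2):
    """Progressive unfreezing: thaw one more top layer of the pre-trained
    encoder every `epochs_per_stage` epochs instead of fine-tuning all at once."""
    num_unfrozen = min(len(encoder_layers), 1 + epoch // epochs_per_stage)
    for depth, layer in enumerate(reversed(list(encoder_layers))):
        trainable = depth < num_unfrozen
        for p in layer.parameters():
            p.requires_grad = trainable


if __name__ == "__main__":
    B, T, d, L = 2, 7, 16, 4
    agg = DynamicAggregation(L)
    memory = agg([torch.randn(B, T, d) for _ in range(L)])
    dec = ParallelCrossAttention(d_model=d, nhead=4)
    out = dec(torch.randn(B, T, d), memory)
    print(out.shape)  # torch.Size([2, 7, 16])
    layers = [nn.Linear(d, d) for _ in range(L)]
    progressive_unfreeze(layers, epoch=3)  # 1 + 3//2 = 2 top layers trainable
```

The gate in ParallelCrossAttention is only one way to fuse two attention branches; the paper may combine them differently (for example, by summation), so treat the fusion step as a placeholder for the published design.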

Foundation Support

Science and Technology Research Project of the Jiangxi Provincial Department of Education (GJJ2202613, GJJ212015)

Publication Information

DOI: 10.19734/j.issn.1001-3695.2023.08.0331
Published in: Application Research of Computers (printed article), Vol. 41, No. 3, 2024
Section: Algorithm Research & Explore
Pages: 799-804,810
Serial Number: 1001-3695(2024)03-022-0799-06

Publication History

[2023-10-12] Accepted Paper
[2024-03-05] Printed Article

Cite This Article

占思琦, 徐志展, 杨威, 等. 基于深度编码注意力的XLNet-Transformer汉-马低资源神经机器翻译优化方法 [J]. 计算机应用研究, 2024, 41 (3): 799-804,810. (Zhan Siqi, Xu Zhizhan, Yang Wei, et al. XLNet-Transformer optimization method for Chinese-Malay low-resource neural machine translation based on deep coded attention [J]. Application Research of Computers, 2024, 41 (3): 799-804,810. )

About the Journal

  • Application Research of Computers, monthly journal
  • Journal ID: ISSN 1001-3695; CN 51-1196/TP

Application Research of Computers, founded in 1984, is an academic journal on computer technology sponsored by the Sichuan Institute of Computer Sciences under the Science and Technology Department of Sichuan Province.

Focusing on urgently needed cutting-edge technologies in the discipline, Application Research of Computers reflects in a timely manner the mainstream technologies, hot topics, and latest development trends in computer application research at home and abroad. The journal mainly publishes high-level academic papers in the discipline, the latest scientific research results, and major application results. Its columns cover new theories in computer science, fundamental computer theory, algorithm theory, algorithm design and analysis, blockchain technology, system software and software engineering, pattern recognition and artificial intelligence, computer architecture, advanced computing, parallel processing, database technology, computer networks and communication technology, information security, and computer graphics and image processing, together with their latest hot application technologies.

Application Research of Computers has a large body of high-level readers and authors, mainly senior and mid-level researchers and engineers in the computer field, as well as faculty and students in computer science and related majors at colleges and universities. Over the years, its total citation frequency and Web download rate have ranked among the top of comparable academic journals in the discipline, and the papers it publishes are popular with readers for their novelty, academic value, foresight, guiding orientation, and practicality.


Indexing & Evaluation

  • 100 Key Journals of the Second National Periodical Award
  • Double-Effect Journal of the China Journal Formation
  • Core Journal of China (Peking University, 2023 edition)
  • Core Journal of China Science and Technology
  • Chinese Science Citation Database (CSCD) source journal
  • RCCSE Chinese Core Academic Journal
  • Journal of the China Computer Federation
  • Included in the 2020-2022 World Journal Clout Index (WJCI) Report of Scientific and Technological Periodicals
  • Full-text source journal of the China Science and Technology Periodicals Database
  • Source journal of the China Academic Journals Comprehensive Evaluation Database
  • Source journal of China Academic Journals (CD-ROM version) and China Journal Network
  • 2017-2019 China Outstanding Academic Journal with International Influence (Natural Science and Engineering Technology)
  • Source journal of the Top Academic Papers (F5000) Program of China's Excellent Science and Technology Journals
  • Source journal of the China Engineering Technology Electronic Information Network and the Electronic Technology Literature Database
  • Source journal of the British Science Abstracts (INSPEC)
  • Source journal of the Japan Science and Technology Agency (JST) database
  • Source journal of the Russian Abstract Journal (AJ, VINITI)
  • Full-text journal of EBSCO (USA)
  • Core journal of Cambridge Scientific Abstracts (Natural Sciences) (CSA(NS))
  • Index Copernicus (IC), Poland
  • Ulrichsweb (USA)