Journal of Applied Science and Engineering

Published by Tamkang University Press


Qian Chen

School of Foreign Languages, Zhengzhou University of Science and Technology, Zhengzhou, China


 

Received: November 11, 2025
Accepted: November 24, 2025
Publication Date: January 19, 2026

Copyright © The Author(s). This is an open access article distributed under the terms of the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.


DOI: https://doi.org/10.6180/jase.202607_30.035


This paper presents a Chinese-English neural machine translation model that hybridizes self-attention with a lightweight gating module to better capture the distinct morphological and syntactic characteristics of the two languages. Conventional Transformers treat source and target sequences homogeneously, ignoring the fact that Chinese relies on analytic structure while English is more morphologically marked. We therefore split the encoder into two parallel branches: a multi-head self-attention stack that learns long-range dependencies, and a convolution-guided gate that dynamically emphasizes character-level features such as word-boundary and sub-word information. The two branches interact through a learnable fusion gate whose parameters are updated by the overall translation loss, allowing the network to switch softly between global and local views at each layer. A mirrored gating sub-layer is further inserted into the decoder to prevent the model from over-attending to the language-specific signals of either side. Experiments on multiple Chinese-English corpora show that the proposed architecture substantially improves translation adequacy and fluency without extra parameters or latency, confirming the effectiveness of integrating language-aware inductive bias into mainstream attention-based frameworks.
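The abstract does not spell out the fusion mechanism, so the following is a minimal PyTorch sketch of one plausible reading of the gated encoder layer. The class name GatedFusionEncoderLayer, the depthwise-convolution local branch, the sigmoid gating formula, and all dimensions are illustrative assumptions, not the paper's actual design.

import torch
import torch.nn as nn

class GatedFusionEncoderLayer(nn.Module):
    # Combines a global self-attention branch with a local convolutional
    # branch through a learnable fusion gate, as sketched in the abstract.
    def __init__(self, d_model: int = 512, n_heads: int = 8, kernel_size: int = 3):
        super().__init__()
        # Global branch: standard multi-head self-attention over the sequence.
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Local branch: depthwise 1-D convolution, a stand-in for the
        # "convolution-guided gate" emphasizing boundary/sub-word cues.
        self.conv = nn.Conv1d(d_model, d_model, kernel_size,
                              padding=kernel_size // 2, groups=d_model)
        # Fusion gate: per-position, per-channel mixing coefficient in (0, 1),
        # trained end-to-end by the overall translation loss.
        self.gate = nn.Linear(2 * d_model, d_model)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        global_feat, _ = self.attn(x, x, x)                        # long-range dependencies
        local_feat = self.conv(x.transpose(1, 2)).transpose(1, 2)  # character-level features
        g = torch.sigmoid(self.gate(torch.cat([global_feat, local_feat], dim=-1)))
        fused = g * global_feat + (1 - g) * local_feat             # soft global/local switch
        return self.norm(x + fused)                                # residual + layer norm

Calling layer = GatedFusionEncoderLayer() on an input torch.randn(2, 16, 512) returns a tensor of the same shape; under the same assumptions, a mirrored copy of this gating sub-layer would sit inside each decoder layer.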


Keywords: Chinese-English neural machine translation, Lightweight gating module, Conventional Transformer, Hybrid self-attention mechanism

