
[1] Fang Yunan, Jiang Rui. Gradient parameter data integrity protection scheme in the Ring Allreduce architecture [J]. Journal of Southeast University (English Edition), 2023, 39(1): 81-88. [doi:10.3969/j.issn.1003-7985.2023.01.010]


Journal of Southeast University (English Edition)[ISSN:1003-7985/CN:32-1325/N]

2023, Vol. 39, No. 1
Research Field:
Computer Science and Engineering

Gradient parameter data integrity protection scheme in the Ring Allreduce architecture
Fang Yunan Jiang Rui
School of Cyber Science and Engineering, Southeast University, Nanjing 210096, China
Keywords: distributed machine learning; data integrity; group key agreement; Ring Allreduce architecture
As there is no existing research on protecting gradient parameter data integrity in the Ring Allreduce architecture, a Ring Allreduce architecture oriented gradient parameter data integrity protection scheme (RAA-DIP) is proposed. An identity-based group key agreement algorithm and the Boneh-Lynn-Shacham (BLS) signature are used to protect the integrity of gradient parameter data in the Ring Allreduce architecture (RAA). Combined with identity authentication and the key negotiation algorithm, secure and efficient dynamic management of worker nodes is realized. On the basis of the decisional bilinear Diffie-Hellman problem, secure group key negotiation is implemented so that neither the key generation center nor a network attacker can compute the shared secret of the worker nodes, which solves the key escrow problem and ensures the integrity of the transmitted gradient data. Finally, the RAA-DIP scheme is formally proved, and its simulated performance is compared with those of related schemes. The results show that the RAA-DIP scheme guarantees the integrity of gradient parameter data during transmission in Ring Allreduce, realizes dynamic management of worker nodes, and solves the key escrow problem. Compared with related schemes, it meets both security and performance requirements.
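The aggregation pattern the scheme protects can be illustrated with a short sketch. The Python below simulates the reduce-scatter and all-gather phases of Ring Allreduce among n worker nodes and attaches an integrity tag to every transmitted chunk. This is not the paper's implementation: HMAC-SHA256 with a fixed shared key is an illustrative stand-in for the BLS signatures and the negotiated group key, and all names are hypothetical.

```python
import hashlib
import hmac

# Stand-in for the group key produced by the ID-based key agreement (illustrative).
GROUP_KEY = b"negotiated-group-key"

def tag(chunk):
    """Integrity tag over a gradient chunk; HMAC-SHA256 stands in for a BLS signature."""
    data = b"".join(repr(x).encode() for x in chunk)
    return hmac.new(GROUP_KEY, data, hashlib.sha256).digest()

def ring_allreduce(grads):
    """Sum the workers' gradient vectors with the ring algorithm.

    grads: one equal-length gradient vector per worker; the vector length
    must be divisible by the number of workers n.
    """
    n = len(grads)
    assert len(grads[0]) % n == 0
    step = len(grads[0]) // n
    # chunks[w][c] is worker w's current copy of chunk c
    chunks = [[list(g[c * step:(c + 1) * step]) for c in range(n)] for g in grads]

    def send(src, c):
        """Worker src sends chunk c to its ring successor, who verifies the tag."""
        chunk, t = list(chunks[src][c]), tag(chunks[src][c])
        assert hmac.compare_digest(tag(chunk), t), "integrity check failed"
        return (src + 1) % n, chunk

    # Reduce-scatter: in round r, worker w forwards chunk (w - r) mod n;
    # the receiver adds it into its own copy of that chunk.  All sends are
    # snapshotted before any sums are applied, mimicking simultaneous transfers.
    for r in range(n - 1):
        outgoing = []
        for w in range(n):
            c = (w - r) % n
            dst, chunk = send(w, c)
            outgoing.append((dst, c, chunk))
        for dst, c, chunk in outgoing:
            chunks[dst][c] = [a + b for a, b in zip(chunks[dst][c], chunk)]

    # All-gather: worker w now holds the fully reduced chunk (w + 1) mod n;
    # circulate the finished chunks so every worker ends with the full sum.
    for r in range(n - 1):
        outgoing = []
        for w in range(n):
            c = (w + 1 - r) % n
            dst, chunk = send(w, c)
            outgoing.append((dst, c, chunk))
        for dst, c, chunk in outgoing:
            chunks[dst][c] = chunk

    return [x for c in range(n) for x in chunks[0][c]]
```

With three workers holding [1.0]*6, [2.0]*6, and [3.0]*6, every worker ends with [6.0]*6. In the scheme described above, each chunk would instead carry a BLS signature verified against group key material, so tampering by a node or an in-network attacker is detected before the chunk is accumulated.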


[1] Gibiansky A. Bringing HPC techniques to deep learning [EB/OL].(2017-02-21)[2022-12-05]. https://andrew.gibiansky.com/blog/machine-learning/baidu-allreduce.
[2] Kim S, Yu G I, Park H, et al. Parallax: Sparsity-aware data parallel training of deep neural networks [C]//Proceedings of the Fourteenth EuroSys Conference. Dresden, Germany, 2019: 1-15. DOI: 10.1145/3302424.3303957.
[3] Kurth T, Treichler S, Romero J, et al. Exascale deep learning for climate analytics [C]//International Conference for High Performance Computing, Networking, Storage and Analysis. Dallas, TX, USA, 2018:649-660. DOI: 10.1109/SC.2018.00054.
[4] Lü G F, Li M F, An H, et al. Distributed deep learning system for cancerous region detection on Sunway TaihuLight [J]. CCF Transactions on High Performance Computing, 2020, 2(4): 348-361. DOI: 10.1007/s42514-020-00046-5.
[5] Cui H, Radosavljevic V, Chou F C, et al. Multimodal trajectory predictions for autonomous driving using deep convolutional networks [C]//2019 International Conference on Robotics and Automation. Montreal, Canada, 2019: 2090-2096. DOI: 10.1109/ICRA.2019.8793868.
[6] Liang J, Makoviychuk V, Handa A, et al. GPU-accelerated robotic simulation for distributed reinforcement learning [C]//Conference on Robot Learning. Zurich, Switzerland, 2018: 270-282. arXiv: 1810.05762.
[7] Reisizadeh A, Prakash S, Pedarsani R, et al. CodedReduce: A fast and robust framework for gradient aggregation in distributed learning [J]. IEEE/ACM Transactions on Networking, 2022, 30(1): 148-161. DOI: 10.1109/TNET.2021.3109097.
[8] Jia X, Song S, He W, et al. Highly scalable deep learning training system with mixed-precision: Training ImageNet in four minutes [EB/OL]. (2018-07-30)[2022-09-05]. https://arxiv.org/abs/1807.11205.
[9] Mikami H, Suganuma H, Tanaka Y, et al. Massively distributed SGD: ImageNet/ResNet-50 training in a flash [EB/OL]. (2019-03-05)[2022-09-05]. https://arxiv.org/abs/1811.05233.
[10] Zhang A, Chen J, Hu R Q, et al. SeDS: Secure data sharing strategy for D2D communication in LTE-Advanced networks [J]. IEEE Transactions on Vehicular Technology, 2015, 65(4): 2659-2672. DOI: 10.1109/TVT.2015.2416002.
[11] Lopes A P G, Gondim P R L. Group authentication protocol based on aggregated signatures for D2D communication [J]. Computer Networks, 2020, 178: 107192. DOI: 10.1016/j.comnet.2020.107192.
[12] Choi K Y, Hwang J Y, Lee D H. Efficient ID-based group key agreement with bilinear maps [C]//Proceedings of the 7th International Workshop on Public Key Cryptography. Singapore, 2004: 130-144. DOI: 10.1007/978-3-540-24632-9_10.
[13] Burmester M, Desmedt Y. A secure and efficient conference key distribution system [C]//EUROCRYPT’94. Perugia, Italy, 1994: 275-286. DOI: 10.1007/BFb0053443.
[14] Kim Y, Perrig A, Tsudik G. Tree-based group key agreement [J]. ACM Transactions on Information and System Security, 2004, 7(1): 60-96. DOI: 10.1145/984334.984337.
[15] Mao Y, Sun Y, Wu M, et al. JET: Dynamic join-exit-tree amortization and scheduling for contributory key management [J]. IEEE/ACM Transactions on Networking, 2006, 14(5): 1128-1140. DOI: 10.1109/TNET.2006.882851.
[16] Wang M, Yan Z. Privacy-preserving authentication and key agreement protocols for D2D group communications [J]. IEEE Transactions on Industrial Informatics, 2017, 14(8): 3637-3647. DOI: 10.1109/TII.2017.2778090.
[17] Scott M. Efficient implementation of cryptographic pairings [EB/OL]. (2016-02-08)[2022-12-05]. http://www.pairing-conference.org/2007/invited/Scott_slide.pdf.


Biographies: Fang Yunan (1998—), female, graduate; Jiang Rui (corresponding author), male, doctor, professor, R.Jiang@seu.edu.cn.
Foundation items: The National Natural Science Foundation of China (No. 61372103), the Natural Science Foundation of Jiangsu Province (No. BK20201265), and the Foundation of the National Engineering Research Center of Classified Protection and Safeguard Technology for Cybersecurity (No. C21640-2).
Citation: Fang Yunan, Jiang Rui. Gradient parameter data integrity protection scheme in the Ring Allreduce architecture [J]. Journal of Southeast University (English Edition), 2023, 39(1): 81-88. DOI: 10.3969/j.issn.1003-7985.2023.01.010.
Last Update: 2023-03-20