[1] Gibiansky A. Bringing HPC techniques to deep learning [EB/OL].(2017-02-21)[2022-12-05]. https://andrew.gibiansky.com/blog/machine-learning/baidu-allreduce.
[2] Kim S, Yu G I, Park H, et al. Parallax: Sparsity-aware data parallel training of deep neural networks [C]//Proceedings of the Fourteenth EuroSys Conference. Dresden, MO, USA, 2019: 1-15. DOI: 10.1145/3302424.3303957.
[3] Kurth T, Treichler S, Romero J, et al. Exascale deep learning for climate analytics [C]//International Conference for High Performance Computing, Networking, Storage and Analysis. Dallas, TX, USA, 2018:649-660. DOI: 10.1109/SC.2018.00054.
[4] Lü G F, Li M F, An H, et al. Distributed deep learning system for cancerous region detection on Sunway TaihuLight [J].CCF Transactions on High Performance Computing, 2020, 2(4): 348-361. DOI: 10.1007/s42514-020-00046-5.
[5] Cui H, Radosavljevic V, Chou F C, et al. Multimodal trajectory predictions for autonomous driving using deep convolutional networks [C]//2019 International Conference on Robotics and Automation. Montreal, Canada, 2019: 2090-2096. DOI: 10.1109/ICRA.2019.8793868.
[6] Liang J, Makoviychuk V, Handa A, et al. Gpu-accelerated robotic simulation for distributed reinforcement learning [C]//Conference on Robot Learning. Zurich, Switzerland, 2018: 270-282. DOI: abs/1810.05762.
[7] Reisizadeh A, Prakash S, Pedarsani R, et al. Codedreduce: A fast and robust framework for gradient aggregation in distributed learning [J].IEEE/ACM Transactions on Networking, 2022, 30(1): 148-161. DOI: 10.1109/TNET.2021.3109097.
[8] Jia X, Song S, He W, et al. Highly scalable deep learning training system with mixed-precision: Training ImageNet in four minutes [EB/OL].(2018-07-30)[2022-09-05]. https://arxiv.org/abs/1807.11205.
[9] Mikami H, Suganuma H, Tanaka Y, et al. Massively distributed SGD: ImageNet/ResNet-50 training in a flash [EB/OL].(2019-03-05)[2022-09-05]. https://arxiv.org/abs/1811.05233.
[10] Zhang A, Chen J, Hu R Q, et al. SeDS: Secure data sharing strategy for D2D communication in LTE-Advanced networks [J].IEEE Transactions on Vehicular Technology, 2015, 65(4): 2659-2672. DOI: 10.1109/TVT.2015.2416002.
[11] Lopes A P G, Gondim P R L. Group authentication protocol based on aggregated signatures for D2D communication [J].Computer Networks, 2020, 178: 107192. DOI: 10.1016/j.comnet.2020.107192.
[12] Choi K Y, Hwang J Y, Lee D H. Efficient ID-based group key agreement with bilinear maps [C]//Proceedings of the 7th International Workshop on Public Key Cryptography. Singapore, 2004: 130-144. DOI: 10.1007/978-3-540-24632-9_10.
[13] Burmester M, Desmedt Y. A secure and efficient conference key distribution system [C]//EUROCRYPT’94. Perugia, Italy, 1994: 275-286. DOI: 10.1007/BFb0053443.
[14] Kim Y, Perrig A, Tsudik G. Tree-based group key agreement [J].ACM Transactions on Information and System Security, 2004, 7(1): 60-96. DOI: 10.1145/984334.984337.
[15] Mao Y, Sun Y, Wu M, et al. JET: Dynamic join-exit-tree amortization and scheduling for contributory key management [J].IEEE/ACM Transactions on Networking, 2006, 14(5): 1128-1140. DOI: 10.1109/TNET.2006.882851.
[16] Wang M, Yan Z. Privacy-preserving authentication and key agreement protocols for D2D group communications [J].IEEE Transactions on Industrial Informatics, 2017, 14(8): 3637-3647. DOI: 10.1109/TII.2017.2778090.
[17] Scott M. Efficient implementation of cryptographic pairings [EB/OL].(2016-02-08)[2022-12-05].http://www.pairing-conference.org/2007/invited/Scott_slide.pdf.