Method of adaptive content generation in mobile applications based on personalized deep learning models
DOI: https://doi.org/10.32620/aktt.2025.6.09
