TY - GEN
T1 - A Self-Supervised Method for Accelerated Training of Neural Communication Receivers
AU - Cooke, Corey D.
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - Self-supervised learning (SSL), a branch of unsupervised learning, is a machine learning paradigm for learning from large unlabeled datasets. In this paper we apply principles of SSL to the channel autoencoder problem from communications theory. We demonstrate that by first performing an SSL pre-training step using a contrastive loss, the training time of a neural receiver can be significantly reduced, even when the extra pre-training time is accounted for. This approach could be used to improve the performance of neural receivers in a wide variety of channel conditions.
AB - Self-supervised learning (SSL), a branch of unsupervised learning, is a machine learning paradigm for learning from large unlabeled datasets. In this paper we apply principles of SSL to the channel autoencoder problem from communications theory. We demonstrate that by first performing an SSL pre-training step using a contrastive loss, the training time of a neural receiver can be significantly reduced, even when the extra pre-training time is accounted for. This approach could be used to improve the performance of neural receivers in a wide variety of channel conditions.
UR - http://www.scopus.com/inward/record.url?scp=85202450488&partnerID=8YFLogxK
U2 - 10.1109/ICMLCN59089.2024.10624771
DO - 10.1109/ICMLCN59089.2024.10624771
M3 - Conference contribution
AN - SCOPUS:85202450488
T3 - 2024 IEEE International Conference on Machine Learning for Communication and Networking, ICMLCN 2024
SP - 151
EP - 157
BT - 2024 IEEE International Conference on Machine Learning for Communication and Networking, ICMLCN 2024
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 1st IEEE International Conference on Machine Learning for Communication and Networking, ICMLCN 2024
Y2 - 5 May 2024 through 8 May 2024
ER -