A Self-Supervised Method for Accelerated Training of Neural Communication Receivers

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Self-supervised learning (SSL), a branch of unsupervised learning, is a new machine learning paradigm for learning from large unlabeled datasets. In this paper we apply the principles of SSL to the channel autoencoder problem from communications theory. We demonstrate that by first performing an SSL pre-training step using a contrastive loss, the training time of a neural receiver can be significantly reduced, even when the extra pre-training time is accounted for. This approach could be used to improve the performance of neural receivers under a wide variety of channel conditions.
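Since the record gives no implementation details, the sketch below only illustrates the general recipe the abstract describes: contrastive (SimCLR-style NT-Xent) pre-training of a receiver encoder, where two independent noisy channel observations of the same transmitted block form a positive pair. The `Encoder` architecture, the toy AWGN channel, and all hyperparameters are illustrative assumptions, not the authors' method.

```python
# Minimal sketch of contrastive SSL pre-training for a neural receiver.
# Everything here (model, toy channel, hyperparameters) is an assumption
# made for illustration; the paper's actual setup may differ.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """Maps a block of received I/Q samples to an embedding vector."""
    def __init__(self, block_len=64, embed_dim=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * block_len, 128), nn.ReLU(),
            nn.Linear(128, embed_dim),
        )

    def forward(self, x):  # x: (batch, block_len, 2)
        return self.net(x.flatten(1))

def nt_xent(z1, z2, tau=0.1):
    """NT-Xent (SimCLR-style) contrastive loss over a batch of positive pairs."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)  # (2B, D) unit vectors
    sim = z @ z.t() / tau                               # cosine-similarity logits
    eye = torch.eye(sim.size(0), dtype=torch.bool)
    sim = sim.masked_fill(eye, float("-inf"))           # exclude self-similarity
    B = z1.size(0)                                      # row i's positive sits at i+B (mod 2B)
    targets = torch.cat([torch.arange(B, 2 * B), torch.arange(0, B)])
    return F.cross_entropy(sim, targets)

def two_views(bits, snr_db=10.0):
    """Two independent AWGN observations of the same transmitted block:
    the 'augmentations' that form a positive pair."""
    sym = 1.0 - 2.0 * bits                              # map bits to +/-1 on I and Q
    sigma = 10 ** (-snr_db / 20)
    return sym + sigma * torch.randn_like(sym), sym + sigma * torch.randn_like(sym)

enc = Encoder()
opt = torch.optim.Adam(enc.parameters(), lr=1e-3)
for step in range(200):                                 # SSL pre-training loop
    bits = torch.randint(0, 2, (256, 64, 2)).float()    # random payload bits
    v1, v2 = two_views(bits)
    loss = nt_xent(enc(v1), enc(v2))
    opt.zero_grad()
    loss.backward()
    opt.step()
# After pre-training, the encoder would be fine-tuned with an ordinary
# supervised demapping loss; the speed-up the abstract reports comes from
# that fine-tuning stage converging faster.
```

The choice of "augmentation" is the key design decision in this reading: two noise realizations of the same transmitted block play the role that random crops play in vision SSL, so the encoder is pushed to represent the underlying symbols invariantly to channel noise.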

Original language: English
Title of host publication: 2024 IEEE International Conference on Machine Learning for Communication and Networking, ICMLCN 2024
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 151-157
Number of pages: 7
ISBN (Electronic): 9798350343199
DOIs
State: Published - 2024
Event: 1st IEEE International Conference on Machine Learning for Communication and Networking, ICMLCN 2024 - Stockholm, Sweden
Duration: May 5, 2024 – May 8, 2024

Publication series

Name: 2024 IEEE International Conference on Machine Learning for Communication and Networking, ICMLCN 2024

Conference

Conference: 1st IEEE International Conference on Machine Learning for Communication and Networking, ICMLCN 2024
Country/Territory: Sweden
City: Stockholm
Period: 05/5/24 – 05/8/24
