Abstract

Dynamic Neural Networks (DyNNs) adapt their network architecture at run time, in contrast to static neural networks. DyNNs are beneficial for wireless receivers based on neural networks (neural receivers), which need to adapt their performance under varying channel conditions. Mixture-of-Experts (MoE) is an efficient way to realize a DyNN in neural receivers: several smaller expert networks are dynamically combined at run time to form a dedicated network that matches the channel requirements. This paper presents MEAN, a novel hard-gated (also known as sparsely-gated) MoE-based neural receiver architecture for Single Input Multiple Output (SIMO) wireless communication systems. For a SIMO wireless system, the proposed MEAN architecture reduces the number of active layers at run time by up to 50%.
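
As a rough illustration of the hard-gated MoE idea described above, the sketch below shows a top-1 gated layer in which only one of several small expert networks is active per input, so the layers of the unselected experts are skipped at run time. This is a minimal PyTorch-style example; the class name, expert sizes, and gating rule are illustrative assumptions and are not taken from the MEAN paper.

```python
import torch
import torch.nn as nn


class HardGatedMoE(nn.Module):
    """Hard (top-1) gated Mixture-of-Experts layer: one expert per sample."""

    def __init__(self, in_features: int, out_features: int, num_experts: int = 4):
        super().__init__()
        self.out_features = out_features
        # Several smaller expert networks; only one runs per input.
        self.experts = nn.ModuleList(
            [
                nn.Sequential(nn.Linear(in_features, out_features), nn.ReLU())
                for _ in range(num_experts)
            ]
        )
        # Gating network scores the experts from the (channel-dependent) input features.
        self.gate = nn.Linear(in_features, num_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Hard gating: pick the single best-scoring expert per sample,
        # so the remaining experts stay inactive at run time.
        scores = self.gate(x)                # (batch, num_experts)
        expert_idx = scores.argmax(dim=-1)   # (batch,)
        out = x.new_zeros(x.shape[0], self.out_features)
        for e, expert in enumerate(self.experts):
            mask = expert_idx == e
            if mask.any():
                out[mask] = expert(x[mask])
        return out


if __name__ == "__main__":
    # Toy usage: in a neural receiver, features derived from the received
    # SIMO signal would be fed to the layer instead of random data.
    layer = HardGatedMoE(in_features=16, out_features=8, num_experts=4)
    y = layer(torch.randn(32, 16))
    print(y.shape)  # torch.Size([32, 8])
```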
Original language: English
Number of pages: 4
Status: E-pub ahead of print - 6 Oct 2024
Event: IFIP/IEEE International Conference on Very Large Scale Integration, VLSI-SoC 2024
- Tangier, Morocco
Duration: 6 Oct 2024 - 9 Oct 2024
https://vlsisoc2024.nl/

Conference

Conference: IFIP/IEEE International Conference on Very Large Scale Integration, VLSI-SoC 2024
Abbreviated title: VLSI-SoC 2024
Country/Region: Morocco
City: Tangier
Period: 6/10/24 - 9/10/24
Internet address

Fingerprint

Dive into the research topics of 'MEAN: Mixture-of-Experts Based Neural Receiver'. Together they form a unique fingerprint.
