Abstract
Dynamic Neural Networks (DyNNs) adapt their network architecture at run time, in contrast to static neural networks. DyNNs are beneficial for wireless receivers based on neural networks (neural receivers), which need to adapt their performance under varying channel conditions. Mixture-of-Experts (MoE) is an efficient way to realize a DyNN in neural receivers: several smaller expert networks are dynamically combined at run time to form a dedicated network that matches the channel requirements. This paper presents MEAN, a novel hard-gated (also known as sparsely gated) MoE-based neural receiver architecture for Single Input Multiple Output (SIMO) wireless communication systems. For a SIMO wireless system, the proposed MEAN architecture reduces the number of active layers at run time by up to 50%.
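To illustrate the hard-gated (top-1) MoE idea summarized above, the following is a minimal PyTorch sketch of a sparsely gated expert layer in which only the selected expert is evaluated per input. The class name, dimensions, and gating input are illustrative assumptions; this is not the MEAN architecture or the configuration used in the paper.

```python
import torch
import torch.nn as nn


class HardGatedMoE(nn.Module):
    """Top-1 (hard-gated / sparsely gated) mixture of experts: only the
    expert selected by the gate runs for each input, so the remaining
    expert layers stay inactive at run time (illustrative sketch)."""

    def __init__(self, in_dim: int, hidden_dim: int, out_dim: int, num_experts: int):
        super().__init__()
        self.gate = nn.Linear(in_dim, num_experts)  # gating network scores each expert
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(in_dim, hidden_dim),
                nn.ReLU(),
                nn.Linear(hidden_dim, out_dim),
            )
            for _ in range(num_experts)
        )
        self.out_dim = out_dim

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_dim), e.g. received-signal features (hypothetical input)
        scores = self.gate(x)            # (batch, num_experts)
        chosen = scores.argmax(dim=-1)   # hard (top-1) expert selection
        out = x.new_zeros(x.size(0), self.out_dim)
        for k, expert in enumerate(self.experts):
            mask = chosen == k
            if mask.any():               # evaluate only the experts that were selected
                out[mask] = expert(x[mask])
        return out


# Illustrative usage with made-up dimensions (not the paper's configuration).
moe = HardGatedMoE(in_dim=64, hidden_dim=128, out_dim=16, num_experts=4)
y = moe(torch.randn(8, 64))              # y: (8, 16)
```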
Original language | English |
---|---|
Number of pages | 4
Status | E-pub ahead of print - 6 Oct 2024
Event | IFIP/IEEE International Conference on Very Large Scale Integration, VLSI-SoC 2024 - Tangier, Morocco. Duration: 6 Oct 2024 → 9 Oct 2024. https://vlsisoc2024.nl/
Conference
Conference | IFIP/IEEE International Conference on Very Large Scale Integration, VLSI-SoC 2024 |
---|---|
Abbreviated title | VLSI-SoC 2024
Country/Territory | Morocco
City | Tangier
Period | 6/10/24 → 9/10/24
Internet address | https://vlsisoc2024.nl/