Pruning-Aware Loss Functions for STOI-Optimized Pruned Recurrent Autoencoders for the Compression of the Stimulation Patterns of Cochlear Implants at Zero Delay

authored by
Reemt Hinrichs, Jörn Ostermann
Abstract

Cochlear implants (CIs) are surgically implanted hearing devices that can restore a sense of hearing in people suffering from profound hearing loss. Wireless streaming of audio from external devices to CI signal processors has become commonplace. Specialized compression of the stimulation patterns of a CI by deep recurrent autoencoders can decrease the power consumption of such wireless streaming applications through bit-rate reduction at zero latency. While previous research achieved considerable bit-rate reductions, model size was ignored, even though it can be of crucial importance in hearing devices due to their limited computational resources. This work investigates maximizing the objective speech intelligibility of the coded stimulation patterns of deep recurrent autoencoders while minimizing model size. For this purpose, a pruning-aware loss is proposed, which captures the impact of pruning during training. Training with this pruning-aware loss is compared to conventional magnitude-informed pruning and is found to yield considerable improvements in objective intelligibility, especially at higher pruning rates. After fine-tuning, little to no degradation of objective intelligibility is observed up to a pruning rate of about 55 %. The proposed pruning-aware loss yields substantial gains in objective speech intelligibility scores after pruning compared to the magnitude-informed baseline for pruning rates above 45 %.
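The abstract contrasts a pruning-aware loss with conventional magnitude-informed pruning. The paper's exact formulation is not given here; the following is a minimal hypothetical sketch of the general idea, in which the training loss also evaluates the network under a magnitude-pruning mask so that the optimizer "sees" the effect of pruning. The function names, the single linear layer, and the mixing weight `alpha` are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def magnitude_prune_mask(w, rate):
    # Keep the largest-magnitude fraction (1 - rate) of weights;
    # zero out the smallest `rate` fraction (magnitude-informed pruning).
    k = int(np.ceil(rate * w.size))
    if k == 0:
        return np.ones_like(w, dtype=bool)
    thresh = np.partition(np.abs(w).ravel(), k - 1)[k - 1]
    return np.abs(w) > thresh

def pruning_aware_loss(w, x, y, rate, alpha=0.5):
    # Hypothetical pruning-aware loss: a convex combination of the
    # reconstruction error of the dense weights and of the same weights
    # after magnitude pruning, so training accounts for the pruned model.
    dense_err = np.mean((x @ w - y) ** 2)
    w_pruned = w * magnitude_prune_mask(w, rate)
    pruned_err = np.mean((x @ w_pruned - y) ** 2)
    return alpha * dense_err + (1.0 - alpha) * pruned_err
```

In such a scheme, gradients of the second term flow through the surviving weights only, which is one plausible way to make the dense network robust to a later one-shot pruning step.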

Organisation(s)
Institute of Information Processing
Type
Conference contribution
Pages
1427-1432
No. of pages
6
Publication date
27.10.2024
Publication status
Published
Peer reviewed
Yes
ASJC Scopus subject areas
Signal Processing, Computer Networks and Communications
Sustainable Development Goals
SDG 3 - Good Health and Well-being
Electronic version(s)
https://doi.org/10.1109/IEEECONF60004.2024.10943066 (Access: Closed)
https://doi.org/10.48550/arXiv.2502.02424 (Access: Open)