
Enhancing Speech Codec Efficiency with Intra-Inter Broad Attention Mechanism

Abstract

This paper introduces a new approach to speech compression that combines advanced attention mechanisms, LSTM integration, and dual-branch conformer structures to optimize codec efficiency. The study addresses five research questions concerning intra-inter broad attention, multi-head attention networks, LSTM-based sequence modeling, redundancy elimination, and the comparative performance of IBACodec against traditional codecs. A quantitative methodology is used, with performance metrics that include bitrate efficiency and quality evaluation. Results confirm that IBACodec significantly improves context awareness, compression efficiency, sequence modeling, and redundancy elimination relative to existing solutions, positioning it as a leading approach to speech compression. Further research is needed to explore real-time applications and broader datasets.
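The abstract describes the codec's main architectural ingredients only at a high level. As a rough illustration of how multi-head attention and an LSTM might be paired in a single encoder block, the sketch below uses a PyTorch-style module; the class name, dimensions, and wiring (IntraInterAttentionBlock, d_model, n_heads) are assumptions made for illustration and are not taken from the paper.

    # Minimal sketch, assuming a PyTorch-style encoder block; names and
    # wiring are hypothetical and do not reflect the paper's actual design.
    import torch
    import torch.nn as nn

    class IntraInterAttentionBlock(nn.Module):
        """Toy block pairing multi-head self-attention (intra-frame context)
        with an LSTM (inter-frame sequence modeling)."""

        def __init__(self, d_model: int = 256, n_heads: int = 4):
            super().__init__()
            self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
            self.lstm = nn.LSTM(d_model, d_model, batch_first=True)
            self.norm1 = nn.LayerNorm(d_model)
            self.norm2 = nn.LayerNorm(d_model)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, time, d_model) frame embeddings of the speech signal
            attn_out, _ = self.attn(x, x, x)   # attention across frames
            x = self.norm1(x + attn_out)       # residual connection + norm
            lstm_out, _ = self.lstm(x)         # sequential (inter-frame) modeling
            return self.norm2(x + lstm_out)

    if __name__ == "__main__":
        frames = torch.randn(2, 100, 256)      # 2 utterances, 100 frames each
        block = IntraInterAttentionBlock()
        print(block(frames).shape)             # torch.Size([2, 100, 256])

In a full codec this kind of block would sit inside an encoder-quantizer-decoder pipeline; the sketch only shows how attention and LSTM layers can be composed, not the paper's dual-branch conformer or quantization stages.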


How to Cite

Gnanzou, D. (2025, June 30). Enhancing Speech Codec Efficiency with Intra-Inter Broad Attention Mechanism. Abhi International Journal of Information Processing Management, Volume e6oCzzqc1eEBEoi5KW83, Issue 1.