
Spoken-Intent Classification using Hybrid Activation Function

Publication Type : Conference Paper

Publisher : IEEE

Source : 2025 International Conference on Robotics and Mechatronics (ICRM)

Url : https://doi.org/10.1109/icrm66809.2025.11349071

Campus : Amritapuri

School : School of Engineering

Center : Humanitarian Technology (HuT) Labs

Department : Electronics and Communication

Year : 2025

Abstract : Efficient speech-intent models for edge robots require activation functions that balance computational efficiency with strong gradient flow. Rectified Linear Units (ReLU) are cheap but suffer from dead neurons, while Gaussian Error Linear Units (GELU) provide better gradients at greater arithmetic cost. We introduce ReLU-GELU Fusion, a hybrid activation that routes positive inputs through ReLU and negative inputs through GELU, retaining the advantages of both. The proposed activation is incorporated into a lightweight convolutional network and evaluated on the SpokeN-100 English corpus.
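The paper's exact formulation is not reproduced in this abstract, but the described routing rule (positive inputs pass through ReLU, negative inputs through GELU) can be sketched as follows. Since ReLU is the identity on positive inputs, the hybrid reduces to a piecewise function; the function name `relu_gelu_fusion` and the use of the exact (erf-based) GELU are assumptions for illustration, not the authors' implementation.

```python
import math

def gelu(x: float) -> float:
    # Exact GELU: x * Phi(x), where Phi is the standard Gaussian CDF.
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def relu_gelu_fusion(x: float) -> float:
    # Hypothetical sketch of the hybrid activation described in the abstract:
    # positive inputs take the cheap ReLU branch (identity),
    # negative inputs take the GELU branch, which keeps a small
    # nonzero output and gradient instead of ReLU's hard zero.
    return x if x > 0.0 else gelu(x)
```

On the positive side the function costs no more than ReLU; the more expensive erf evaluation is only paid for negative inputs, which is one plausible reading of the claimed efficiency/gradient-flow tradeoff.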

Cite this Research Publication : K Adithya Hegde, Rajesh Kannan Megalingam, Spoken-Intent Classification using Hybrid Activation Function, 2025 International Conference on Robotics and Mechatronics (ICRM), IEEE, 2025, https://doi.org/10.1109/icrm66809.2025.11349071
