International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 186 - Number 81
Year of Publication: 2025
Authors: Arshia Kermani, Ehsan Zeraatkar, Habib Irani
Arshia Kermani, Ehsan Zeraatkar, Habib Irani. Energy-Efficient Transformer Inference: Optimization Strategies for Time Series Classification. International Journal of Computer Applications. 186, 81 (Apr 2025), 1-9. DOI=10.5120/ijca2025924771
The increasing computational demands of transformer models in time series classification necessitate effective optimization strategies for energy-efficient deployment. Our study presents a systematic investigation of optimization techniques, focusing on structured pruning and quantization methods for transformer architectures. Through extensive experimentation on three distinct datasets (RefrigerationDevices, ElectricDevices, and PLAID), model performance and energy efficiency are quantitatively evaluated across different transformer configurations. Our experimental results demonstrate that static quantization reduces energy consumption by 29.14% while maintaining classification performance, and L1 pruning achieves a 63% improvement in inference speed with minimal accuracy degradation. Our findings provide valuable insights into the effectiveness of optimization strategies for transformer-based time series classification, establishing a foundation for efficient model deployment in resource-constrained environments.
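The two optimization techniques highlighted in the abstract can be illustrated in miniature. The sketch below is a minimal NumPy illustration of the core ideas only, not the paper's implementation: symmetric per-tensor int8 quantization (mapping float weights to an 8-bit grid with a single scale factor) and magnitude-based L1 pruning (zeroing the fraction of weights with the smallest absolute values). The function names `quantize_int8` and `l1_prune` and the parameter choices are hypothetical.

```python
import numpy as np

def quantize_int8(w):
    """Symmetric per-tensor int8 quantization: w is approximated by scale * q.

    This is a simplified sketch; real static quantization also calibrates
    activation ranges on representative data, which is omitted here.
    """
    scale = np.abs(w).max() / 127.0          # one scale for the whole tensor
    q = np.clip(np.round(w / scale), -128, 127).astype(np.int8)
    return q, scale

def l1_prune(w, sparsity):
    """Zero out the `sparsity` fraction of weights with smallest |w| (L1 criterion).

    Ties at the threshold may prune slightly more than the requested fraction.
    """
    k = int(w.size * sparsity)
    if k == 0:
        return w.copy()
    # k-th smallest absolute value becomes the pruning threshold
    thresh = np.partition(np.abs(w).ravel(), k - 1)[k - 1]
    pruned = w.copy()
    pruned[np.abs(pruned) <= thresh] = 0.0
    return pruned
```

In a framework such as PyTorch the same ideas are exposed as library routines (e.g. static quantization passes and unstructured L1 pruning of weight tensors); the energy and latency figures reported in the paper come from applying those techniques to full transformer models, not from a toy routine like this.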