International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 187 - Number 93
Year of Publication: 2026
Authors: Andrei Carl L. Castro, Nathan Sheary G. Muñoz, Neo Jezer A. Pare, Joey S. Aviles
10.5120/ijca2026926609
Andrei Carl L. Castro, Nathan Sheary G. Muñoz, Neo Jezer A. Pare, Joey S. Aviles. NutriSnap: Mobile-based Food Recognition with Caloric and Macronutrient Estimation using MobileNetV2 and YOLOv8n. International Journal of Computer Applications 187, 93 (Mar 2026), 31-37. DOI=10.5120/ijca2026926609
This paper presents NutriSnap, a mobile-based food recognition system that estimates caloric and macronutrient content from user-captured images. The system integrates MobileNetV2 for image classification and YOLOv8n for object detection in a modular two-stage pipeline. Trained on Food-101, UEC-256, Food2K, and a custom Filipino food dataset (Phil23), the MobileNetV2 model achieved a Top-1 validation accuracy of 73.19% across 189 food categories, with a macro-averaged F1-score of 0.73 and a weighted F1 of 0.73. The YOLOv8n model, trained using a three-stage fine-tuning approach with synthetic data augmentation, attained 96.1% precision, 92.9% recall, and 97.3% mAP50 on the validation set. Both models were converted to TensorFlow Lite (TFLite) and integrated into a Flutter-based Android application. Nutritional values are retrieved from the USDA FoodData Central and Philippine Food Composition Tables (PhilFCT) databases using a proportion-based formula keyed to user-entered serving weight. System usability was evaluated using the System Usability Scale (SUS) with 68 participants, yielding a mean score of 80.62, categorized as "Excellent." Comprehensive experimental results, including training convergence curves, per-class performance analysis, stage-wise detection metrics, and comparative evaluation against related works, demonstrate that the integrated pipeline is effective for real-time dietary monitoring on resource-constrained mobile devices.
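The proportion-based formula mentioned in the abstract is presumably the standard linear scaling used with food composition databases, which report nutrients per 100 g reference serving. A minimal sketch of that idea, assuming per-100 g values and a user-entered serving weight (the function name and the example nutrient figures are hypothetical, not taken from the paper):

```python
def scale_nutrients(per_100g: dict[str, float], serving_g: float) -> dict[str, float]:
    """Scale per-100 g database nutrient values to an arbitrary serving weight.

    Composition tables (e.g. USDA FoodData Central, PhilFCT) list nutrients
    per 100 g, so a serving of `serving_g` grams scales each value by
    serving_g / 100.
    """
    factor = serving_g / 100.0
    return {name: round(value * factor, 2) for name, value in per_100g.items()}


# Hypothetical per-100 g values for an illustrative food item.
example_per_100g = {
    "calories_kcal": 215.0,
    "protein_g": 13.0,
    "fat_g": 16.0,
    "carbs_g": 3.0,
}
print(scale_nutrients(example_per_100g, 150.0))
```

The same scaling applies to every nutrient field, which is why a single multiplicative factor per serving suffices once the food class has been recognized and its database record retrieved.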