Abstract: Calorie estimation is crucial for dietary tracking and health management. This work presents a novel method that combines depth prediction and feature fusion to improve estimation accuracy. Deep learning models predict depth information to accurately estimate the volume of food items; the predicted depth is then fused with RGB and texture features using aggregation techniques to refine calorie computation. By exploiting RGB-D images, the proposed method overcomes the limitations of conventional 2D-based approaches, yielding more precise calorie and portion-size estimates. Experimental results show that the method outperforms traditional approaches and achieves higher precision in calorie measurement. The combination of machine learning and multimodal data fusion ensures robustness and efficiency in real-time applications. Prior research has confirmed the benefits of depth information for volume estimation and demonstrated the efficacy of depth-based methods in food analysis. Potential applications include healthcare, automated nutrition monitoring, and personalized diet management. By increasing the accuracy of calorie calculation, this method supports intelligent health-monitoring tools that help users maintain balanced eating habits.
Keywords: Deep learning, CNN, Graph Neural Networks, food calorie estimation, depth prediction, feature fusion, RGB-D images, nutritional analysis, multimodal data fusion, machine learning, food volume estimation, real-time applications, dietary tracking, healthcare, AI-driven models
DOI: 10.17148/IARJSET.2025.12304