Photo Meals — Using LLMs to Manage Nutrition
A real challenge in nutrition management today is the pain of data entry. I don’t want to weigh all my groceries, and, as a bit of a foodie, it’s quite difficult to get nutrition data while dining at a restaurant. But I always take a photo of what I’m eating.
About a year ago I put together a simple demo where I took a photo of food and used AI to get all the nutritional information. It was a fairly clever setup, returning not just calories but also a variety of macronutrients.
I didn’t want to deal with managing health data myself, particularly given privacy sensitivities, so I just posted the data to Google Fit using its web API. However, that API has since been deprecated, so the entire project had to be reimagined.
And it has been! Photo Meals is now available to download on Android.
It does require a monthly subscription to run, since the multimodal LLM isn’t free to me either. But in return you get unlimited access: take a photo, get a good nutritional estimate.
I’m not asking the LLM for calories directly. Instead, I use it to identify what is on your plate and estimate portion sizes, then look up the nutrition data deterministically. You can adjust the portion sizes afterwards.
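To make that two-step idea concrete, here is a minimal sketch in Kotlin. Everything in it is illustrative rather than the app’s actual code: the `FoodGuess` and `Nutrition` types, the lookup table, and the sample values are all my own stand-ins for whatever nutrition database the app queries.

```kotlin
// Hypothetical sketch: the LLM only names foods and estimates portions;
// calories and macros come from a deterministic local lookup, not the model.

data class FoodGuess(val name: String, val estimatedGrams: Double)

data class Nutrition(val kcal: Double, val proteinG: Double, val carbsG: Double, val fatG: Double)

// Per-100 g reference values from a nutrition database (numbers here are illustrative).
val NUTRITION_PER_100G = mapOf(
    "grilled chicken breast" to Nutrition(165.0, 31.0, 0.0, 3.6),
    "white rice" to Nutrition(130.0, 2.7, 28.0, 0.3),
)

// Scale the per-100 g values by the estimated (or user-adjusted) portion size.
fun nutritionFor(guess: FoodGuess): Nutrition? =
    NUTRITION_PER_100G[guess.name]?.let { per100 ->
        val scale = guess.estimatedGrams / 100.0
        Nutrition(
            kcal = per100.kcal * scale,
            proteinG = per100.proteinG * scale,
            carbsG = per100.carbsG * scale,
            fatG = per100.fatG * scale,
        )
    }
```

Because the lookup is deterministic, tweaking a portion size simply re-runs the same scaling, so the numbers stay consistent no matter how the model phrased its guess.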
Similar to my earlier prototype, data gets saved into your phone’s Health Connect service. Your data stays on-device, and Health Connect lets it plug securely into a broader ecosystem of health apps, so it can be paired with other experiences for greater insights.
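For a sense of what that write looks like, here is a minimal sketch using the Jetpack Health Connect client (`androidx.health.connect:connect-client`). The function name, timing, and units are my own assumptions rather than the app’s real code, and the `NutritionRecord` constructor parameters vary slightly between library versions.

```kotlin
import android.content.Context
import androidx.health.connect.client.HealthConnectClient
import androidx.health.connect.client.records.NutritionRecord
import androidx.health.connect.client.units.Energy
import androidx.health.connect.client.units.Mass
import java.time.Instant
import java.time.ZoneOffset

// Sketch: write one meal's totals to Health Connect.
// Assumes the WRITE_NUTRITION permission has already been requested and granted.
suspend fun saveMeal(context: Context, kcal: Double, proteinG: Double, carbsG: Double, fatG: Double) {
    val client = HealthConnectClient.getOrCreate(context)
    val now = Instant.now()
    val record = NutritionRecord(
        startTime = now.minusSeconds(15 * 60),   // assume the meal spanned the last 15 minutes
        startZoneOffset = ZoneOffset.UTC,
        endTime = now,
        endZoneOffset = ZoneOffset.UTC,
        energy = Energy.kilocalories(kcal),
        protein = Mass.grams(proteinG),
        totalCarbohydrate = Mass.grams(carbsG),
        totalFat = Mass.grams(fatG),
        name = "Photo meal",
    )
    client.insertRecords(listOf(record))
}
```

Once the record is in Health Connect, any other app the user has authorized can read it, which is what keeps the nutrition data out of my hands entirely.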
You should definitely install this app and give it a try.