Thousands of first-person photos analyzed by AI can predict a person's chronic stress levels based on the greenness of their view.
Environmental features in a person's actual daily visual field correlate directly with their mental health. The vague idea that nature is good for you now has a measurable counterpart: the visual exposome. AI vision models can quantify exactly how much greenery or urban clutter a person sees throughout the day, and these measurements predict momentary moods and long-term stress markers more accurately than self-reports. This suggests that urban planning could be treated as a direct intervention for public mental health.
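As a toy illustration of turning a first-person photo into a numeric exposure metric, one could compute a naive green-pixel fraction per image. This is only a crude heuristic sketch for intuition; the study itself uses vision-language models for semantic labeling, not this color rule, and the function and threshold here are assumptions for the example:

```python
import numpy as np

def green_fraction(rgb: np.ndarray) -> float:
    """Fraction of pixels where green dominates red and blue.

    A crude per-image 'greenness' proxy (illustrative only);
    VLM-based semantic labeling, as in the paper, is far richer.
    """
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    greenish = (g > r) & (g > b)  # pixel counts as "green" if G channel is largest
    return float(greenish.mean())

# Synthetic 2x2 "photo": two greenish pixels, one red, one blue.
img = np.array([
    [[10, 200, 10], [30, 120, 40]],
    [[200, 10, 10], [10, 10, 200]],
], dtype=np.uint8)

print(green_fraction(img))  # 0.5
```

Averaging such a score over the thousands of photos a participant captures in a day would yield one scalar exposure value per person, the kind of quantity that can then be correlated with stress measures.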
Quantifying the human visual exposome with vision language models
arXiv · 2605.03863
The visual environment is a fundamental yet unquantified determinant of mental health. While the concept of the environmental exposome is well established, current methods rely on coarse geospatial proxies or biased self-reports, failing to capture the first-person visual context of daily life. We addressed this gap by coupling ecological momentary assessment with vision language models (VLMs) to quantify the semantic richness of human visual experience. Across 2674 participant-generated photographs [...]