MoodLens passively monitors biometric and linguistic signals from your wearable and phone to detect early signs of depression — before crisis strikes.
Depression leaves digital fingerprints weeks before a crisis. Today, those signals go undetected.
Over 60 million Americans experience a mental illness annually, yet most won't see a mental health professional — held back by stigma, cost, and access. (CDC, 2024)
Four in ten adolescents who experience a major depressive episode receive zero treatment, leaving them vulnerable to worsening outcomes. (Annie E. Casey Foundation, 2024)
Psychiatric care happens in scheduled appointments. Crises don't. There's almost no infrastructure to catch someone in the weeks before they reach a breaking point.
Wearables and smartphones continuously collect sleep patterns, heart rate variability, activity levels, and communication behavior — all known early indicators of depressive episodes.
MoodLens fuses biometric and linguistic signals into a PHQ-9-aligned severity score, then responds proportionally.
Biometric branch: sleep, HRV, steps, and SpO2 via Google Health Connect → logistic regression on Databricks → PHQ Score 1
Linguistic branch: call log metadata and anonymized text patterns → fine-tuned sentiment analysis → PHQ Score 2
Fusion: weighted combination of both sub-scores → final PHQ-9 score
Response: tiered interventions, from affirmations to human responders
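The fusion step above can be sketched in a few lines. This is a minimal illustration: the function name, the 0.6/0.4 weights, and the convex-combination form are our assumptions, not MoodLens's calibrated production values.

```python
def fuse_phq_scores(biometric_score: float, linguistic_score: float,
                    w_biometric: float = 0.6, w_linguistic: float = 0.4) -> float:
    """Combine the two PHQ-9 sub-scores (each on the 0-27 PHQ-9 scale)
    into a single severity score via a convex weighted average.

    The weights here are illustrative placeholders, not calibrated values.
    """
    if not (0 <= biometric_score <= 27 and 0 <= linguistic_score <= 27):
        raise ValueError("PHQ-9 sub-scores must lie in [0, 27]")
    if abs(w_biometric + w_linguistic - 1.0) > 1e-9:
        raise ValueError("weights must sum to 1")
    return w_biometric * biometric_score + w_linguistic * linguistic_score
```

A convex combination keeps the fused score on the same 0-27 scale as its inputs, so the downstream tier thresholds stay interpretable.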
Low tier: No active intervention is needed. The app delivers personalized affirmations on demand, proactively supporting resilience and emotional well-being.
Moderate tier: The user is connected to an empathetic AI voice conversation powered by a large language model and ElevenLabs text-to-speech — a stigma-free, always-available bridge to support.
High tier: Emergency contacts, volunteer mental health professionals, or medical responders are alerted and provided with PHQ trend data and flagged linguistic signals for rapid triage.
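Routing a fused score to one of the three tiers is a simple threshold lookup. The standard PHQ-9 severity bands (0-4 minimal, 5-9 mild, 10-14 moderate, 15-19 moderately severe, 20-27 severe; Kroenke et al.) motivate the sketch below, but the specific cutoffs of 10 and 20 are illustrative assumptions, not clinical guidance or MoodLens's documented thresholds.

```python
def route_tier(phq9_score: float) -> str:
    """Map a fused PHQ-9 score to one of three response tiers.

    The cutoffs (10 and 20) are illustrative assumptions loosely aligned
    with standard PHQ-9 severity bands, not clinical guidance.
    """
    if not 0 <= phq9_score <= 27:
        raise ValueError("PHQ-9 scores range from 0 to 27")
    if phq9_score < 10:
        return "low"        # on-demand affirmations
    if phq9_score < 20:
        return "moderate"   # AI voice support conversation
    return "high"           # alert human responders
```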
Every model and methodology is grounded in peer-reviewed clinical literature.
Because ethically labeled biometric-PHQ datasets are scarce, we generated synthetic training data using Spearman correlation coefficients from:
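A common way to synthesize samples that match a published Spearman correlation is a Gaussian copula: draw correlated standard normals whose Pearson correlation is 2·sin(π·ρs/6), the exact conversion for this copula, then push the marginals through the normal CDF. The numpy-only sketch below is a generic illustration of that trick; the function name and the example target correlation are ours, not the project's actual generation code.

```python
import math
import numpy as np

def synth_pair(rho_s: float, n: int, seed: int = 0) -> np.ndarray:
    """Draw n samples of two variables whose Spearman correlation is ~rho_s,
    using a Gaussian copula. Returns an (n, 2) array of uniforms in [0, 1]
    that can then be mapped through any marginal inverse-CDF."""
    # Convert the target Spearman rho to the Pearson rho of the latent
    # bivariate normal (exact relation for the Gaussian copula).
    rho_p = 2.0 * math.sin(math.pi * rho_s / 6.0)
    cov = np.array([[1.0, rho_p], [rho_p, 1.0]])
    rng = np.random.default_rng(seed)
    z = rng.multivariate_normal(np.zeros(2), cov, size=n)
    # Map latent normals to uniforms via the standard normal CDF; rank
    # correlation is preserved under this monotone transform.
    phi = np.vectorize(lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0))))
    return phi(z)
```

Because Spearman correlation is invariant under monotone marginal transforms, the generated pairs keep the target rank correlation no matter which realistic marginals (sleep hours, PHQ scores) are applied afterwards.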
State-of-the-art transformer model fine-tuned on depression-related linguistic patterns. RoBERTa improves on BERT through dynamic masking, a 160 GB training corpus, and dropping the next-sentence-prediction (NSP) objective, reaching 94–96% accuracy on standard sentiment benchmarks.
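Turning the classifier's output into "PHQ Score 2" is a calibration choice. A minimal sketch, assuming the model emits a negative-sentiment probability in [0, 1] that is rescaled linearly onto the 0-27 PHQ-9 range; the linear mapping is our assumption, not the project's documented calibration.

```python
def sentiment_to_phq(p_negative: float) -> float:
    """Rescale a negative-sentiment probability onto the 0-27 PHQ-9 range.

    Linear rescaling is the simplest possible calibration; a deployed
    system would instead fit this mapping against labeled PHQ-9 outcomes.
    """
    if not 0.0 <= p_negative <= 1.0:
        raise ValueError("probability must lie in [0, 1]")
    return round(27.0 * p_negative, 1)
```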
Both models are hosted as inference endpoints on Databricks, providing scalable, low-latency predictions from the mobile app with enterprise-grade reliability.
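Invoking a Databricks model serving endpoint is a single authenticated POST to its `invocations` URL with a `dataframe_records` JSON payload. The sketch below only builds that request (the workspace URL, endpoint name, token, and feature names are placeholders we invented for illustration); the actual HTTP call is left to the caller.

```python
import json

def build_invocation(host: str, endpoint: str, token: str, records: list) -> tuple:
    """Build the URL, headers, and JSON body for a Databricks model-serving
    invocation. The network call itself is left to the caller
    (e.g. urllib.request or requests)."""
    url = f"{host}/serving-endpoints/{endpoint}/invocations"
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"dataframe_records": records})
    return url, headers, body

# Example with placeholder values (hypothetical workspace, endpoint, features):
url, headers, body = build_invocation(
    "https://example.cloud.databricks.com",
    "moodlens-biometric",
    "dapi-XXXX",
    [{"sleep_hours": 5.2, "hrv_ms": 38.0, "steps": 2100, "spo2": 0.95}],
)
```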
Integrates with Google Health Connect to pull passive biometric data from Pixel Watch, Fitbit, and other supported wearables — no manual logging required.
The moderate-tier intervention uses a large language model paired with ElevenLabs text-to-speech synthesis to deliver empathetic, naturalistic voice-based support conversations.
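The TTS leg of that pipeline maps to a single call against ElevenLabs's v1 REST API. The sketch below builds the request but performs no network I/O; the voice ID, API key, and chosen model ID are placeholder assumptions, and in the real flow the LLM would supply `text` while the caller plays back the returned audio bytes.

```python
import json

def build_tts_request(voice_id: str, api_key: str, text: str) -> tuple:
    """Build the URL, headers, and body for an ElevenLabs text-to-speech
    call (v1 REST API). The caller performs the HTTP POST and handles
    the returned audio stream."""
    url = f"https://api.elevenlabs.io/v1/text-to-speech/{voice_id}"
    headers = {
        "xi-api-key": api_key,
        "Content-Type": "application/json",
        "Accept": "audio/mpeg",
    }
    body = json.dumps({
        "text": text,
        "model_id": "eleven_multilingual_v2",  # placeholder model choice
    })
    return url, headers, body
```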
Mobile-first React application built with Vite for fast builds and HMR. Privacy-first architecture: message content is analyzed locally or anonymized, never stored in raw form.
A multidisciplinary team united by a belief that technology can close the gap between depression and care.
Architected the biometric model pipeline and the synthetic data generation workflow.
Databricks model deployment and agentic AI development.
Built the React mobile frontend and Health Connect integration for passive biometric data collection from wearables.
Whether you're a researcher, clinician, investor, or potential collaborator — we'd love to hear from you.
MoodLens is an open-source Hacklythics project exploring the frontier of passive mental health monitoring. We're looking for research partners, clinical advisors, and anyone passionate about making mental healthcare more proactive.
Thanks for reaching out. We'll get back to you within 48 hours.