APRIL 3, 2026
Jiachen Li, MS
PhD Candidate
Khoury College of Computer Sciences
Northeastern University
About the Presentation: With the growing prevalence of modern ubiquitous computing technologies, multi-modal sensing systems can capture individuals' daily activities and personal health, holding promise for data-sharing scenarios such as providing timely awareness and reassurance to family members assisting older adults with aging in place. However, the presentation of this sensing information is often rigid, isolated, and metrics-driven, and lacks customization, typically emphasizing “what” the person is doing through numbers and simple visualizations (e.g., step counts). Recent advances in Large Language Models (LLMs) have introduced new possibilities to transform complex sensing data into cohesive, high-level narrative accounts through sophisticated sensemaking, helping answer the key question, “How is my [X] doing?” This is particularly crucial for stakeholders such as remote family members of older adults, who feel a strong sense of emotional responsibility yet have limited visibility into the older adults' daily lives and limited capacity for caregiving.
In this talk, I focus on how information about daily activities and personal health, captured through multi-modal sensing data, should be understood and shared between older adults and their remote family members with the assistance of AI.
About the Presenter: Jiachen Li is a PhD candidate at the Khoury College of Computer Sciences at Northeastern University, advised by Prof. Varun Mishra and Prof. Beth Mynatt. Her research focuses on human-AI collaboration in healthcare and clinical settings.