In this program, the investigators will develop novel multimodal neural-behavioral-physiological monitoring tools (software and hardware) and machine learning models of mental states during social processes and beyond.
The tools consist of a multimodal skin-like wearable sensor for physiological and biochemical sensing; a conversational virtual human platform to evoke naturalistic social processes; audiovisual affect recognition software; synchronization tools; and machine learning methods to model the multimodal data.
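To illustrate the kind of cross-modal synchronization such a toolkit would require, the sketch below aligns two streams sampled at different rates onto a shared clock by nearest-timestamp matching. This is a minimal illustration only, not the investigators' implementation; the stream names, sampling rates, and column names are hypothetical.

```python
# Minimal sketch: aligning two multimodal streams on a common clock.
# All stream names, rates, and columns are hypothetical.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
t0 = pd.Timestamp("2024-01-01 00:00:00")

# Simulated skin-sensor physiology (~32 Hz) and audiovisual affect scores (~10 Hz).
physio = pd.DataFrame({
    "timestamp": t0 + pd.to_timedelta(np.arange(0, 60, 1 / 32), unit="s"),
    "eda_microsiemens": rng.normal(2.0, 0.1, 60 * 32),
})
affect = pd.DataFrame({
    "timestamp": t0 + pd.to_timedelta(np.arange(0, 60, 1 / 10), unit="s"),
    "valence": rng.uniform(-1, 1, 60 * 10),
})

# Pair each affect estimate with the nearest physiological sample within 50 ms.
aligned = pd.merge_asof(
    affect.sort_values("timestamp"),
    physio.sort_values("timestamp"),
    on="timestamp",
    direction="nearest",
    tolerance=pd.Timedelta("50ms"),
)
print(aligned.head())
```

The aligned table can then serve as input to downstream machine learning models of the multimodal data.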
The investigators will demonstrate the tools in healthy subjects without neural recordings and in patients with drug-resistant epilepsy who already have intracranial EEG (iEEG) electrodes implanted, based solely on clinical criteria, for standard seizure-localization monitoring unrelated to this study.