Sleep is one of the most fundamental behaviors in animals and humans, and understanding group sleep will provide key insights into neuroscience and social behavior and interactions. To overcome limitations stemming from single-modality animal behavior platforms, the project will develop a multimodal machine learning method that simultaneously monitors and processes electroencephalogram (EEG) data and animal behavior data to systematically study group behavior, especially sleep, and to annotate animal social movements and behavior. The outcomes of the project will potentially provide a powerful deep-learning-based toolkit for making sense of complex animal behavior and EEG activity patterns for mechanistic exploration. Subproblems from this project will be developed into course materials and will serve as capstone projects or directed studies for undergraduate students.<br/><br/>The project will process multiple data modalities and group activities involving multiple entities from multiple data sources through a multimodal machine learning framework that enables the extraction and aggregation of the most pertinent information. A “dictionary” of movements at the semantic level will be developed for learning from and processing long video and EEG recordings, which pose a significant challenge for current state-of-the-art self-attention transformer models. Additionally, to incorporate group interactions, transformer models for dialogue modeling will be developed. The proposed simultaneous EEG and behavior study will provide biological underpinnings of group sleep, leading to insights into the relationship between brain electrical signaling and behavioral outputs: how the brain marshals its signaling units to generate behaviors.<br/><br/>This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
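The abstract does not specify how EEG and behavior streams would be fused, but one common pattern in multimodal transformer work is cross-attention, in which features from one modality query features from the other so that the two time series need not be sampled at the same rate. The sketch below is purely illustrative and is not the project's actual method; the feature dimensions, random projection weights, and the `cross_attend` helper are all hypothetical stand-ins for learned components.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attend(eeg, video, rng):
    """Align video-frame features to EEG time steps via cross-attention.

    eeg:   (T_eeg, d) features, one row per EEG time step
    video: (T_video, d) features, one row per video frame
    In a trained model the projections below would be learned; here they
    are random placeholders just to show the data flow.
    """
    d = eeg.shape[1]
    Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))
    q, k, v = eeg @ Wq, video @ Wk, video @ Wv
    attn = softmax(q @ k.T / np.sqrt(d))  # (T_eeg, T_video) alignment weights
    return attn @ v                       # video information per EEG step

rng = np.random.default_rng(0)
eeg_feats = rng.standard_normal((50, 16))    # 50 EEG time steps, 16-dim features
video_feats = rng.standard_normal((30, 16))  # 30 video frames, 16-dim features
fused = cross_attend(eeg_feats, video_feats, rng)
print(fused.shape)  # (50, 16): one fused feature vector per EEG step
```

Note that the fused output keeps the EEG temporal resolution even though the two modalities have different lengths, which is why this kind of mechanism is attractive for simultaneously recorded but asynchronously sampled streams.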