Brain-computer interfaces (BCIs) are especially valuable for physically disabled users, allowing them to provide input by controlling their brain activity, which is sensed through specialized hardware. Such hardware is rapidly becoming less expensive and less obtrusive, paving the way for BCIs aimed at a wider audience. This project develops an infrastructure for BCI-based sensing of a person's cognitive workload, then uses this information to structure how the computer interacts with the person. The infrastructure will enable research on several kinds of adaptive interfaces, including educational tools that use brain activity to make lessons harder or easier, and methods for evaluating the comprehensibility and difficulty of data visualizations. The research has the potential to benefit society by improving education, data analysis, and interfaces for both people in general and people with disabilities. Further, the lead researcher will use the infrastructure and the research it enables to attract students from groups traditionally underrepresented in computer science, as well as to support courses on affective computing and data visualization.<br/><br/>The funding supports the infrastructure necessary to build and test adaptive user interfaces that respond intelligently to a user's cognitive state in real time. In particular, the PIs will acquire a multichannel frequency-domain fNIRS (functional near-infrared spectroscopy) device and develop the algorithms required to process fNIRS signals and extract workload information. Signals will be pre-processed to remove motion artifacts and to derive meaningful features (notably, the mean and the linear-regression slope, following the literature) for each of the 16 channels provided by the fNIRS device. The team plans to use personalized support-vector-machine models to distinguish between high and low workload states, building on prior work by the PIs that shows the effectiveness of this approach.
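As a rough illustration of the feature-extraction step described above, the sketch below computes the two named features (mean and least-squares linear-regression slope) for each channel of a signal window. The window layout, sample count, and function name are illustrative assumptions, not the project's actual implementation; the downstream personalized SVM classifier (e.g., a per-user model trained on these feature vectors with a library such as scikit-learn) is omitted.

```python
def extract_features(window):
    """Compute per-channel [mean, slope] features for one signal window.

    window: a list of samples, each sample a list of per-channel readings
            (e.g., 16 values for a 16-channel fNIRS device).
    Returns a flat feature vector: [mean_0, slope_0, mean_1, slope_1, ...].
    """
    n = len(window)
    t = list(range(n))          # sample indices serve as the time axis
    t_mean = sum(t) / n
    den = sum((ti - t_mean) ** 2 for ti in t)  # shared slope denominator

    feats = []
    n_channels = len(window[0])
    for c in range(n_channels):
        ch = [sample[c] for sample in window]
        ch_mean = sum(ch) / n
        # least-squares slope of this channel's signal over the window
        num = sum((ti - t_mean) * (xi - ch_mean) for ti, xi in zip(t, ch))
        slope = num / den
        feats.append(ch_mean)
        feats.append(slope)
    return feats
```

Each window would yield one such feature vector (32 values for 16 channels), which the personalized classifier then maps to a high- or low-workload label.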