EEG-NIRS dataset TYUT emotion recognition (ENTER)
The EEG-NIRS dataset TYUT emotion recognition (ENTER) is a collection of simultaneous EEG-fNIRS recordings evoked by four kinds of emotional video clips. Fifty college students, 25 male (age: 22.92±1.71) and 25 female (age: 24.12±1.67), volunteered for this study. All participants self-reported normal or corrected-to-normal vision and normal hearing. Each participant gave written informed consent prior to participation; the inclusion criteria required that participants self-identify as right-handed and self-report no history of mental illness or drug use. The EEG-fNIRS data from all participants were included in the data analysis. Each participant sat in a comfortable chair facing a computer screen and watched 60 emotional video clips (sad, happy, calm, and fear) according to the experimental protocol. Each video clip lasted 1-2 minutes, after which the participant had 30 s to rate the type of emotion evoked. Each participant was instructed in the experimental procedure in detail before the experiment began.
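The trial structure above (continuous recording segmented into 60 video-clip trials) can be sketched as follows. This is a minimal, hypothetical illustration: the trial onsets, data array, and 60 s epoch length are invented for the example, since the dataset's actual file format and event markers are not described here.

```python
import numpy as np

# Hypothetical sketch: cut a continuous 64-channel EEG recording into
# per-trial epochs given onset markers (in samples) for each video clip.
# All values below are illustrative, not taken from the ENTER dataset files.
FS = 1000                # EEG sampling rate (Hz), as stated in the text
TRIAL_LEN_S = 60         # clips last 1-2 min; 60 s used as a fixed epoch here

rng = np.random.default_rng(1)
continuous = rng.standard_normal((64, FS * 200))   # (channels, samples)
onsets = [0, 70 * FS, 140 * FS]                    # three example trial onsets

# Stack fixed-length windows starting at each onset: trials x channels x samples
epochs = np.stack([continuous[:, o : o + TRIAL_LEN_S * FS] for o in onsets])
print(epochs.shape)  # (3, 64, 60000)
```

In practice the onsets would come from the event channel of the recording rather than being hard-coded.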
Both EEG and fNIRS sensors were placed on the participant’s scalp, with no contact between the two recording devices (Neuroscan SynAmps2 and DanYang HuiChuang NirSmart). EEG was recorded at a 1000 Hz sampling rate from 64 channels, and fNIRS at an 11 Hz sampling rate from 18 channels. During the experiment, each participant was asked to minimize head movements to avoid signal artifacts.
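Because the two modalities are sampled at very different rates (1000 Hz vs. 11 Hz), multimodal analyses typically need to align them on a common time base. One common option is to downsample the EEG to the fNIRS rate with an anti-aliasing resampler; the sketch below assumes synthetic arrays of the stated channel counts, since the dataset's on-disk format is not specified here.

```python
import numpy as np
from scipy.signal import resample_poly

# Hypothetical alignment of a 10 s EEG segment (1000 Hz, 64 ch) with the
# matching fNIRS segment (11 Hz, 18 ch); the data here are random stand-ins.
EEG_FS, NIRS_FS = 1000, 11
DURATION_S = 10

rng = np.random.default_rng(0)
eeg = rng.standard_normal((64, EEG_FS * DURATION_S))    # (channels, samples)
nirs = rng.standard_normal((18, NIRS_FS * DURATION_S))

# Rational-rate resampling 1000 -> 11 Hz (up=11, down=1000);
# resample_poly applies an anti-aliasing FIR filter internally.
eeg_ds = resample_poly(eeg, up=NIRS_FS, down=EEG_FS, axis=1)

print(eeg_ds.shape)  # (64, 110): same sample count as the fNIRS segment
```

The resulting arrays have one sample per fNIRS frame, so the two modalities can be concatenated or compared frame by frame.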
ENTER is open to the academic community. If you are interested in the dataset and would like access to it, please download and fill out the license agreement below and send it to chenguijun@tyut.edu.cn. We will send you a download link by email after reviewing your application.