Subject Areas: Journal of Artificial Intelligence in Electrical Engineering
Mahnam Mirzae 1, Mahdi Azarnoosh 2, Hamidreza Kobravi 3
1 -
2 -
3 -
Keywords:
Abstract:
