Analysis of databases of facial expressions for the automatic identification of learning-centered emotions

Keywords: Databases, Automatic identification of emotions, Facial expressions

Abstract

This work presents an analysis of the state of the art in facial expression databases for the automatic identification of learning-centered emotions. Obtaining data from a specific context is essential to the success of automatic recognition processes. This project therefore begins by reviewing the information available for carrying out the training and classification stages of emotion recognition with the proposed computational techniques. The search activities for facial expression databases that capture learning-centered emotions are described; these activities formed part of the methodology for recognizing students' emotions while they carried out online learning activities. This analysis justified the creation of a new database and the formalization of a protocol covering everything from its capture to its digitization.

References

Aifanti, N., Papachristou, C., & Delopoulos, A. (2010). The MUG facial expression database. 11th International Workshop on Image Analysis for Multimedia Interactive Services (WIAMIS 10), 1–4.

Almohammadi, K., Hagras, H., Yao, B., Alzahrani, A., Alghazzawi, D., & Aldabbagh, G. (2017). A type-2 fuzzy logic recommendation system for adaptive teaching. Soft Computing, 21(4). https://doi.org/10.1007/s00500-015-1826-y

Aneja, D., Colburn, A., Faigin, G., Shapiro, L., & Mones, B. (2017). Modeling Stylized Character Expressions via Deep Learning. In Computer Vision – ACCV 2016. ACCV 2016. Lecture Notes in Computer Science (Vol. 10112). Springer, Cham. https://doi.org/10.1007/978-3-319-54184-6_9

Arana-Llanes, J. Y., González-Serna, G., Pineda-Tapia, R., Olivares-Peregrino, V., Ricarte-Trives, J. J., & Latorre-Postigo, J. M. (2018). EEG lecture on recommended activities for the induction of attention and concentration mental states on e-learning students. Journal of Intelligent & Fuzzy Systems, 34(5). https://doi.org/10.3233/JIFS-169517

Arroyo, I., Cooper, D. G., Burleson, W., Woolf, B. P., Muldner, K., & Christopherson, R. (2009). Emotion sensors go to school. Artificial Intelligence in Education, 17–24.

Barrón-Estrada, M. L., Zatarain-Cabada, R., Aispuro-Medina, B. G., Valencia-Rodríguez, E. M., & Lara-Barrera, A. C. (2016). Building a Corpus of Facial Expressions for Learning-Centered Emotions. Research in Computing Science, 129, 45–52.

Bixler, R., & D’Mello, S. (2013). Towards Automated Detection and Regulation of Affective States During Academic Writing. In Artificial Intelligence in Education. AIED 2013. Lecture Notes in Computer Science (Vol. 7926). Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-39112-5_142

Bosch, N., & D’Mello, S. (2017). The Affective Experience of Novice Computer Programmers. International Journal of Artificial Intelligence in Education, 27(1). https://doi.org/10.1007/s40593-015-0069-5

Bosch, N., D’Mello, S. K., Baker, R. S., Ocumpaugh, J., Shute, V., Ventura, M., Wang, L., & Zhao, W. (2016a). Detecting student emotions in computer-enabled classrooms. IJCAI International Joint Conference on Artificial Intelligence, 4125–4129.

Bosch, N., D’Mello, S. K., Ocumpaugh, J., Baker, R. S., & Shute, V. (2016b). Using Video to Automatically Detect Learner Affect in Computer-Enabled Classrooms. ACM Transactions on Interactive Intelligent Systems, 6(2). https://doi.org/10.1145/2946837

Botelho, A. F., Baker, R. S., & Heffernan, N. T. (2017). Improving Sensor-Free Affect Detection Using Deep Learning. In Artificial Intelligence in Education. AIED 2017. Lecture Notes in Computer Science (Vol. 10331). https://doi.org/10.1007/978-3-319-61425-0_4

Cabada, R., Barrón, M., & Olivares, J. M. (2014). Reconocimiento automático y aspectos éticos de emociones para aplicaciones educativas. Inteligencia Artificial: Una Reflexión Obligada.

Cornelius, R. R. (1996). The science of emotion: Research and tradition in the psychology of emotions. Prentice-Hall, Inc.

Cowie, R., Douglas-Cowie, E., Tsapatsoulis, N., Votsis, G., Kollias, S., Fellenz, W., & Taylor, J. G. (2001). Emotion recognition in human-computer interaction. IEEE Signal Processing Magazine, 18(1). https://doi.org/10.1109/79.911197

Ekman, P. (2004). Emotions Revealed: Recognizing Faces and Feelings to Improve Communication and Emotional Life. Henry Holt and Company.

Ekman, P., Friesen, W. V., & Hager, J. C. (2002). Facial Action Coding System. Research Nexus, Network Research Information.

el Kaliouby, R., & Picard, R. W. (2019). Affectiva Database. MIT Media Laboratory. https://www.affectiva.com/

Freitas-Magalhães, A. (2021). Facial Action Coding System 4.0-Manual of Scientific Codification of the Human Face. Leya.

Fuentes, C., Herskovic, V., Rodríguez, I., Gerea, C., Marques, M., & Rossel, P. O. (2017). A systematic literature review about technologies for self-reporting emotional information. Journal of Ambient Intelligence and Humanized Computing, 8(4). https://doi.org/10.1007/s12652-016-0430-z

González Meneses, Y. N., Guerrero García, J., Reyes García, C. A., Olmos Pineda, I., & González Calleros, J. M. (2019). Methodology for Automatic Identification of Emotions in Learning Environments. Research in Computing Science, 148(5), 89–96.

González-Hernández, F., Zatarain-Cabada, R., Barrón-Estrada, M. L., & Rodríguez-Rangel, H. (2018). Recognition of learning-centered emotions using a convolutional neural network. Journal of Intelligent & Fuzzy Systems, 34(5). https://doi.org/10.3233/JIFS-169514

Graesser, A. C., & D’Mello, S. (2012). Emotions During the Learning of Difficult Material. Psychology of Learning and Motivation, 57. https://doi.org/10.1016/B978-0-12-394293-7.00005-4

Gupta, A., D’Cunha, A., Awasthi, K., & Balasubramanian, V. (2016). DAiSEE: Towards User Engagement Recognition in the Wild. arXiv preprint.

Happy, S. L., Patnaik, P., Routray, A., & Guha, R. (2017). The Indian Spontaneous Expression Database for Emotion Recognition. IEEE Transactions on Affective Computing, 8(1). https://doi.org/10.1109/TAFFC.2015.2498174

Harley, J. M., Bouchet, F., & Azevedo, R. (2013). Aligning and Comparing Data on Emotions Experienced during Learning with MetaTutor. In Artificial Intelligence in Education. AIED 2013. Lecture Notes in Computer Science (Vol. 7926). Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-39112-5_7

Hill, D. (2010). Emotionomics: Leveraging emotions for business success.

Hjortsjö, C. H., Ekman, P., Friesen, W. V., & Hager, J. C. (2019). Sistema de Codificación Facial.

Koelstra, S., Muhl, C., Soleymani, M., Lee, J.-S., Yazdani, A., Ebrahimi, T., Pun, T., Nijholt, A., & Patras, I. (2012). DEAP: A Database for Emotion Analysis; Using Physiological Signals. IEEE Transactions on Affective Computing, 3(1). https://doi.org/10.1109/T-AFFC.2011.15

Langner, O., Dotsch, R., Bijlstra, G., Wigboldus, D. H. J., Hawk, S. T., & van Knippenberg, A. (2010). Presentation and validation of the Radboud Faces Database. Cognition & Emotion, 24(8). https://doi.org/10.1080/02699930903485076

Livingstone, S. R., & Russo, F. A. (2018). The Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS): A dynamic, multimodal set of facial and vocal expressions in North American English. PLOS ONE, 13(5). https://doi.org/10.1371/journal.pone.0196391

Lucey, P., Cohn, J. F., Kanade, T., Saragih, J., & Ambadar, Z. (2010). The extended Cohn-Kanade dataset (CK+): A complete dataset for action unit and emotion-specified expression. 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 94–101.

Lyons, M. J., Kamachi, M., & Gyoba, J. (2014). Japanese Female Facial Expressions (JAFFE), Database of digital images. http://www.kasrl.org/jaffe_info.html

Mavadati, S. M., Mahoor, M. H., Bartlett, K., Trinh, P., & Cohn, J. F. (2013). DISFA: A Spontaneous Facial Action Intensity Database. IEEE Transactions on Affective Computing, 4(2). https://doi.org/10.1109/T-AFFC.2013.4

Mehmood, R., & Lee, H. (2017). Towards Building a Computer Aided Education System for Special Students Using Wearable Sensor Technologies. Sensors, 17(317), 1–22. https://doi.org/10.3390/s17020317

Mena-Chalco, J. P., Cesar-Jr, R., & Velho, L. (2008). Banco de dados de faces 3D: IMPA-FACE3D. IMPA-RJ, Tech. Rep.

Mohamad Nezami, O., Dras, M., Hamey, L., Richards, D., Wan, S., & Paris, C. (2020). Automatic Recognition of Student Engagement Using Deep Learning and Facial Expression. In Machine Learning and Knowledge Discovery in Databases. ECML PKDD 2019. Lecture Notes in Computer Science. https://doi.org/10.1007/978-3-030-46133-1_17

Mollahosseini, A., Hasani, B., & Mahoor, M. H. (2019). AffectNet: A Database for Facial Expression, Valence, and Arousal Computing in the Wild. IEEE Transactions on Affective Computing, 10(1). https://doi.org/10.1109/TAFFC.2017.2740923

Monkaresi, H., Bosch, N., Calvo, R. A., & D’Mello, S. K. (2017). Automated Detection of Engagement Using Video-Based Estimation of Facial Expressions and Heart Rate. IEEE Transactions on Affective Computing, 8(1). https://doi.org/10.1109/TAFFC.2016.2515084

Nye, B., Karumbaiah, S., Tokel, S. T., Core, M. G., Stratou, G., Auerbach, D., & Georgila, K. (2017). Analyzing Learner Affect in a Scenario-Based Intelligent Tutoring System. In Artificial Intelligence in Education. AIED 2017. Lecture Notes in Computer Science (Vol. 10331). Springer, Cham. https://doi.org/10.1007/978-3-319-61425-0_60

Picard, R. W. (2000). Affective Computing. MIT Press.

Picard, R. W. (2003). Affective computing: challenges. International Journal of Human-Computer Studies, 59(1–2). https://doi.org/10.1016/S1071-5819(03)00052-1

Scherer, K. R. (2005). What are emotions? And how can they be measured? Social Science Information, 44(4). https://doi.org/10.1177/0539018405058216

Sneddon, I., McRorie, M., McKeown, G., & Hanratty, J. (2012). The Belfast Induced Natural Emotion Database. IEEE Transactions on Affective Computing, 3(1). https://doi.org/10.1109/T-AFFC.2011.26

Steidl, S. (2009). Automatic classification of emotion related user states in spontaneous children’s speech. Logos-Verlag.

Thomaz, C. E. (2012). FEI Face Database. https://fei.edu.br/~cet/facedatabase.html

Valstar, M., & Pantic, M. (2010). Induced disgust, happiness and surprise: an addition to the MMI facial expression database. Proc. 3rd Intern. Workshop on EMOTION (Satellite of LREC): Corpora for Research on Emotion and Affect.

Xiao, X., Pham, P., & Wang, J. (2017). Dynamics of Affective States During MOOC Learning. In Artificial Intelligence in Education. AIED 2017. Lecture Notes in Computer Science (Vol. 10331). Springer, Cham. https://doi.org/10.1007/978-3-319-61425-0_70

Zafeiriou, S., Kollias, D., Nicolaou, M. A., Papaioannou, A., Zhao, G., & Kotsia, I. (2017). Aff-Wild: Valence and arousal ‘in-the-wild’ challenge. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops, 34–41.

Zatarain-Cabada, R., Barrón-Estrada, M. L., González-Hernández, F., Oramas-Bustillos, R., Alor-Hernández, G., & Reyes-García, C. A. (2017b). Building a Corpus and a Local Binary Pattern Recognizer for Learning-Centered Emotions. In Advances in Soft Computing. MICAI 2016. Lecture Notes in Computer Science (Vol. 10062). https://doi.org/10.1007/978-3-319-62428-0_43

Zatarain-Cabada, R., Barron-Estrada, M. L., Gonzalez-Hernandez, F., & Rodriguez-Rangel, H. (2017a). Building a Face Expression Recognizer and a Face Expression Database for an Intelligent Tutoring System. 2017 IEEE 17th International Conference on Advanced Learning Technologies (ICALT). https://doi.org/10.1109/ICALT.2017.141

Zhao, G., Huang, X., Taini, M., Li, S. Z., & Pietikäinen, M. (2011). Facial expression recognition from near-infrared videos. Image and Vision Computing, 29(9). https://doi.org/10.1016/j.imavis.2011.07.002
How to Cite
González-Meneses, Y. N., & Guerrero-García, J. (2021). Analysis of databases of facial expressions for the automatic identification of learning-centered emotions. Revista Colombiana de Computación, 22(2), 58–71. https://doi.org/10.29375/25392115.4300

Published
2021-12-01
Section
Article of scientific and technological research
