intelligent systems
A. V. Atanassov, D. Pilev, F. Tomova. Improving the Accuracy of Facial Emotion Recognition through Deep Neural Networks for Facial Emotions and Weather Conditions Recognition

Key Words: Deep Neural Networks; Facial Emotion Recognition; Weather Condition Recognition; Python.

Abstract. Emotions are one of the main ways people communicate and express their attitude towards objects, products, services, etc. Emotions are divided into two classes – verbal and non-verbal. Human speech and intonation belong to the first class, while facial expressions and body movements, also known as body language, belong to the second. The subject of this paper is facial emotions and their relationship to the scene in which they occur. A number of studies have established that there is a strong relationship between a person’s emotions and their surroundings. The latter include meteorological conditions (weather) and other objects, such as other people, the landscape, etc. Facial emotions range from the seven basic emotions (joy, anger, surprise, fear, sadness, disgust and neutral) categorized by P. Ekman through his Facial Action Coding System to the 26 emotions represented by Russell through his 3D Valence-Arousal-Dominance model. Most existing deep neural networks for Facial Emotion Recognition (FER) recognize the mentioned seven emotions. In our previous research, we presented a pre-trained FER model with 69.85% accuracy. Weather conditions are closely related to geographic regions and in some cases vary from sunny to cloudy, while in other cases they include some subset of sunny, foggy, snowy, rainy, hot, etc. In this research, we analyze deep neural networks for weather condition recognition and select an appropriate model. We combine our FER DNN with the selected weather recognition DNN to build a bimodal system, which improves facial emotion recognition accuracy to 80-83%, especially in cases when the FER model alone provides contradictory results.
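The bimodal idea described above can be illustrated with a minimal late-fusion sketch: the class probabilities of a FER network are combined with a weather-conditioned prior over the seven emotions. The function name, the weighting scheme, and all probability values below are illustrative assumptions, not the authors' actual model.

```python
import numpy as np

# Hypothetical late-fusion sketch (assumed weighted-product rule); the
# emotion order and all numeric values are illustrative only.
EMOTIONS = ["joy", "anger", "surprise", "fear", "sadness", "disgust", "neutral"]

def fuse(fer_probs, weather_prior, alpha=0.7):
    """Weighted product of two probability vectors, renormalized to sum to 1."""
    fer_probs = np.asarray(fer_probs, dtype=float)
    weather_prior = np.asarray(weather_prior, dtype=float)
    fused = (fer_probs ** alpha) * (weather_prior ** (1.0 - alpha))
    return fused / fused.sum()

# Example: the FER output is ambiguous between joy and sadness; an assumed
# "sunny" prior nudges the fused decision toward joy.
fer = [0.35, 0.05, 0.05, 0.05, 0.34, 0.06, 0.10]
sunny_prior = [0.30, 0.08, 0.12, 0.05, 0.08, 0.07, 0.30]
probs = fuse(fer, sunny_prior)
print(EMOTIONS[int(np.argmax(probs))])  # joy wins after fusion
```

The weight `alpha` controls how much the scene context is allowed to override the face-only prediction; in this sketch the prior matters only when the FER probabilities are nearly tied, which mirrors the paper's claim that the weather branch helps mainly in contradictory cases.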