Facial expressions play a crucial role in human communication by conveying people's emotions and intentions. With advances in computer vision and the use of Deep Neural Networks (DNNs), Facial Expression Recognition (FER) has shown significant progress. Given only a face image, a model can infer a person's emotional state from their facial expression. Automatic facial expression recognition has gained substantial attention in recent decades, as it has significant potential for both academic and commercial purposes.
This technology has many practical applications: it can support mental health analysis, make video games and films more engaging by adapting how characters react to the player's emotions, and inform advertising and marketing research, among other uses.
The dataset utilized for this project comprises images of individuals exhibiting seven distinct emotions: anger, contempt, disgust, fear, happiness, sadness, and surprise. Each image is labeled with one of these emotions, providing a basis for developing and evaluating emotion recognition models.
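To feed a classifier, the seven emotion labels are typically mapped to integer indices and one-hot vectors. The mapping below is a minimal sketch; the class names come from the dataset description above, while the ordering and helper names are illustrative assumptions.

```python
import numpy as np

# The seven emotion classes named in the dataset description.
# The index order here is an assumption, not mandated by the dataset.
EMOTIONS = ["anger", "contempt", "disgust", "fear",
            "happiness", "sadness", "surprise"]
LABEL_TO_INDEX = {name: i for i, name in enumerate(EMOTIONS)}

def one_hot(label: str) -> np.ndarray:
    """Encode an emotion label as a one-hot vector for training."""
    vec = np.zeros(len(EMOTIONS), dtype=np.float32)
    vec[LABEL_TO_INDEX[label]] = 1.0
    return vec
```

One-hot targets pair naturally with a softmax output layer and a cross-entropy loss, which is the usual setup for multi-class emotion classification.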
We begin by loading and preprocessing the images, converting them into a format suitable for training our model. With the data prepared, we construct a Neural Network model. The model comprises layers designed to extract features from the input images and classify them into emotion categories. The model is trained on a portion of the dataset, with the remaining portion reserved for evaluation. Once trained, we evaluate the model's performance on the test dataset to assess its ability to generalize to unseen data.
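The load, preprocess, split, train, and evaluate steps above can be sketched end to end. This is a minimal stand-in, not the project's actual implementation: synthetic grayscale arrays replace the real images, a single softmax layer replaces the full neural network, and all shapes and hyperparameters (48x48 inputs, an 80/20 split, the learning rate) are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
NUM_CLASSES, IMG_SIZE = 7, 48          # assumed 48x48 grayscale inputs

# --- "Load" and preprocess: flatten images and scale pixels to [0, 1] ---
images = rng.integers(0, 256, size=(700, IMG_SIZE, IMG_SIZE))
labels = rng.integers(0, NUM_CLASSES, size=700)
X = images.reshape(len(images), -1).astype(np.float32) / 255.0
Y = np.eye(NUM_CLASSES, dtype=np.float32)[labels]   # one-hot targets

# --- Split: a portion for training, the remainder reserved for evaluation ---
split = int(0.8 * len(X))
X_train, Y_train = X[:split], Y[:split]
X_test, y_test = X[split:], labels[split:]

# --- Train a softmax classifier with batch gradient descent ---
W = np.zeros((X.shape[1], NUM_CLASSES), dtype=np.float32)
b = np.zeros(NUM_CLASSES, dtype=np.float32)
for _ in range(100):
    logits = X_train @ W + b
    logits -= logits.max(axis=1, keepdims=True)     # numerical stability
    probs = np.exp(logits)
    probs /= probs.sum(axis=1, keepdims=True)
    grad = (probs - Y_train) / len(X_train)         # cross-entropy gradient
    W -= 0.5 * X_train.T @ grad
    b -= 0.5 * grad.sum(axis=0)

# --- Evaluate: accuracy on the held-out test portion ---
preds = (X_test @ W + b).argmax(axis=1)
accuracy = (preds == y_test).mean()
```

In the real project the softmax layer would be replaced by stacked feature-extraction and classification layers, but the surrounding flow (scale pixels, one-hot encode, split, fit, score on held-out data) is the same.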