Group Emotion Recognition using deep neural networks and Bayesian classifiers.
GPL-3.0 License
This project aims to classify a group's perceived emotion as Positive, Neutral or Negative. The dataset being used is the Group Affect Database 3.0, which contains "in the wild" photos of groups of people in various social environments.
Our solution is a hybrid machine learning system that builds on the model by Surace et al. and extends it with additional, more refined machine learning methods and experiments. This work has been published in the paper Group Emotion Recognition Using Machine Learning.
This repository consists of 3 branches:
- `master` - contains the code used to train and test the model.
- `webapp` - contains the webapp.
- `android` - contains the android app.

Run `pip install -r requirements.txt` to install the requirements.

Run `python classify_image.py image_dir original_label` to classify images as Positive, Neutral or Negative, e.g. `python classify_image.py input/val/Positive/ Positive` classifies the images in the `input/val/Positive/` directory with the original label Positive.
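For reference, a minimal sketch of what a `classify_image.py`-style entry point could look like. The `predict_fn` interface and the file-extension filter below are illustrative assumptions, not the project's actual code:

```python
import os

LABELS = ("Positive", "Neutral", "Negative")

def classify_directory(image_dir, original_label, predict_fn):
    """Classify every image in `image_dir` and report how many match
    `original_label`. `predict_fn` stands in for the trained model
    and is an assumed interface, not the project's actual code."""
    images = [f for f in os.listdir(image_dir)
              if f.lower().endswith((".jpg", ".jpeg", ".png"))]
    correct = sum(
        1 for name in images
        if predict_fn(os.path.join(image_dir, name)) == original_label)
    return correct, len(images)

# Example: a stub predictor that always answers "Neutral" keeps the
# sketch self-contained; a real run would call the trained model.
# correct, total = classify_directory("input/val/Positive/", "Positive",
#                                     lambda path: "Neutral")
```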
Run `git checkout webapp` to switch to the webapp branch, then `pip install -r requirements.txt` to install its requirements and `python manage.py runserver` to start the server. The frontend can be accessed at http://127.0.0.1:5000.
So, why do we need emotion recognition?
Emotion recognition has applications in crowd analytics, social media, marketing, event detection and summarization, public safety, human-computer interaction, digital security surveillance, street analytics, image retrieval, and more.
The problem of emotion recognition for a group of people has been less extensively studied, but it is gaining popularity due to the massive amount of data available on social networking sites containing images of groups of people participating in social events.
Group emotion recognition is a challenging problem due to obstacles such as head and body pose variations, occlusions, variable lighting conditions, variance among actors, varied indoor and outdoor settings, and differing image quality.
Our solution is a pipeline-based approach that integrates two modules working in parallel, a bottom-up module and a top-down module, based on the idea that the emotion of a group of people can be deduced using both bottom-up and top-down approaches.
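As an illustration of the parallel-module idea, the class probabilities produced by the two modules can be fused into one prediction. The weighted-geometric-mean fusion and the equal weights below are assumptions for the sketch, not the project's actual combination rule:

```python
import numpy as np

LABELS = ["Negative", "Neutral", "Positive"]

def combine_modules(bottom_up, top_down, w_bu=0.5, w_td=0.5):
    """Fuse the class probabilities from two parallel modules.

    `bottom_up` and `top_down` are probability vectors over
    (Negative, Neutral, Positive). The fusion rule and the 0.5/0.5
    weights are illustrative assumptions only.
    """
    bu = np.asarray(bottom_up, dtype=float)
    td = np.asarray(top_down, dtype=float)
    # Weighted geometric mean, then renormalize to a distribution.
    fused = (bu ** w_bu) * (td ** w_td)
    fused /= fused.sum()
    return LABELS[int(np.argmax(fused))], fused

label, probs = combine_modules([0.1, 0.2, 0.7], [0.2, 0.3, 0.5])
print(label)  # prints "Positive": the class with the highest fused probability
```

A multiplicative fusion like this keeps the Bayesian flavor of the classifier stage: each module's probability acts as independent evidence for a class, and the product is renormalized into a posterior-style distribution.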