In this tutorial, we’ll see how we can create a Python program that detects emotion on a human face. This might be interesting if you want to do things like emotion detection using Python, or if you’re training machine learning systems to read human emotions. We’re going to create a program that takes an image as input and outputs a list of the human emotions that the face in the image expresses. To do this, we’re going to use a package called DeepFace.

What is DeepFace

DeepFace is an open-source face recognition and facial attribute analysis framework created for Python. It is a very powerful computer vision library that helps identify things in images, such as the shapes and faces within them, so it’s easy to detect and analyze them. It is a hybrid face recognition framework wrapping models such as VGG-Face, Google FaceNet, OpenFace, Facebook DeepFace, DeepID, ArcFace, and Dlib.

The easiest way to install deepface is to download it from PyPI with the pip command:

pip3 install deepface

For the purpose of this article, we’re only going to use one of the many modules that DeepFace provides: the Facial Attribute Analysis module. It can tell us the age, gender, facial expression, and race of a person from a provided image.

Get emotions from a face in a photo

To begin with, we’ll create a small application that will only show the results, in numeric form.

#emotion_detection.py
import cv2
from deepface import DeepFace
import numpy as np  #OpenCV represents images as numpy arrays

imgpath = 'face_img.png'  #put the image in the folder where this file is located and put its name here
image = cv2.imread(imgpath)

analyze = DeepFace.analyze(image, actions=['emotion'])  #the first parameter is the image we want to analyze, the second is the list of actions to perform
print(analyze)

The DeepFace.analyze function in the above code takes a parameter called actions, which specifies what facial attribute analysis we want it to perform. DeepFace currently supports 4 actions: age, gender, emotion, and race. For our project, we’ll only be using the emotion action.
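
For illustration, here is a minimal sketch of what requesting all four actions at once could look like, reusing the image variable from above (note that, depending on your DeepFace version, analyze may return a list of result dictionaries rather than a single dictionary):

analyze = DeepFace.analyze(image, actions=['age', 'gender', 'emotion', 'race'])  #a sketch, not part of the tutorial code
print(analyze)  #the result will now also contain keys such as 'age' and 'dominant_race'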

You can run the code from the terminal with python3 emotion_detection.py in the folder where the file is located. After running the above code you should get a result that looks something like this:

{'emotion': {'angry': 0.2837756188594808, 'disgust': 2.789757723035734e-07, 'fear': 0.456878175451973, 'happy': 92.482545707485, 'sad': 0.152323190865646454,
'surprise': 1.9998317176006223, 'neutral': 0.0084371718453264}, 'dominant_emotion': 'happy', 'region': {'x': 1910, 'y': 878, 'w': 1820, 'h': 1820}}
[Finished in 13.48s]

You can see here all the facial data that DeepFace was able to collect. The emotion values it shows are ‘angry’, ‘disgust’, ‘fear’, ‘happy’, ‘sad’, ‘surprise’, and ‘neutral’. It also shows the dominant emotion, which is ‘happy’ in our case, and the region of the photo where the face is, which comes in handy when we need to draw something like a box around the face to showcase it.
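
As a small illustration, here is a sketch of how those region values could be used to draw such a box, assuming the same image variable and the dictionary-style result shown above (face_img_boxed.png is a hypothetical output name):

region = analyze['region']  #a sketch, not part of the tutorial code
x, y, w, h = region['x'], region['y'], region['w'], region['h']
cv2.rectangle(image, (x, y), (x+w, y+h), (0, 0, 255), 2)  #a red box, 2 pixels thick
cv2.imwrite('face_img_boxed.png', image)  #saving the annotated copy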

We can also get only the most expressed emotion very easily by using the dominant_emotion key in the print function.

print(analyze['dominant_emotion'])  #add this instead of the already written print function

After making this change, run the code the same way as described above. You should get a result like this:

happy
[Finished in 10.08s]

And if you are wondering why we got happy as the result: it is because happy got the highest numeric value, as you can see in the previous result.
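
In fact, ‘dominant_emotion’ is simply the key with the highest score in the ‘emotion’ dictionary, which we can verify ourselves with Python’s max():

scores = analyze['emotion']  #for illustration only: recomputing the dominant emotion from the raw scores
print(max(scores, key=scores.get))  #prints 'happy' for the result above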

Get emotions from a face through a webcam

Now that we know how this works, we can make it work with a webcam or camera to detect faces and their emotions. We’ll now use the extra packages we imported earlier and extend our previous code to work on a live video feed.

import cv2
from deepface import DeepFace
import numpy as np

face_cascade_name = cv2.data.haarcascades + 'haarcascade_frontalface_alt.xml'  #getting a haarcascade xml file
face_cascade = cv2.CascadeClassifier()  #processing it for our project
if not face_cascade.load(cv2.samples.findFile(face_cascade_name)):  #adding a fallback event
    print("Error loading xml file")

video = cv2.VideoCapture(0)  #requesting the input from the webcam or camera

while video.isOpened():  #checking that we are getting a video feed and using it
    _, frame = video.read()

    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)  #converting the frame to grayscale to make the face analysis work properly
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    for x, y, w, h in faces:
        cv2.rectangle(frame, (x, y), (x+w, y+h), (0, 0, 255), 1)  #drawing a red rectangle around each detected face to showcase it

    #using a try and except condition in case of any errors
    try:
        analyze = DeepFace.analyze(frame, actions=['emotion'])  #the same thing is happening here as in the previous example, but we are using 'frame' as the input
        print(analyze['dominant_emotion'])  #here we only print out the dominant emotion, as explained in the previous example
    except:
        print("no face")

    #this is the part where we display the output to the user
    cv2.imshow('video', frame)
    key = cv2.waitKey(1)
    if key == ord('q'):  #here we specify the key which will stop the loop and stop all the processes going on
        break

video.release()
cv2.destroyAllWindows()  #releasing the camera and closing the window once the loop ends

Output

You can run it the way you did before, by running the file in the terminal. After it has started, it will open your webcam window, and you should see a red box around your face, which is the program trying to detect your face.

You will also see it detecting and printing out your face’s dominant emotion in the terminal, which will look something like this:

neutral
neutral
neutral
neutral
happy
happy
neutral

It will constantly output the detections until you stop it. We also set ‘q’ as the stop key; you can press it to instantly stop all processes at once.

And that’s it! We have created a very simple app that can perform emotion detection using Python. It was a very basic project, but it can help you grow your skills quite a lot, and there’s plenty of room for improvement and experimentation.
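
For example, one possible improvement (a sketch under the same assumptions as the webcam code above, and the file name is hypothetical) is to draw the dominant emotion directly on the video window with OpenCV’s putText instead of printing it to the terminal:

#emotion_overlay.py  -- a sketch of one possible improvement, not part of the tutorial code
import cv2
from deepface import DeepFace

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + 'haarcascade_frontalface_alt.xml')  #loading the same haarcascade as before
video = cv2.VideoCapture(0)

while video.isOpened():
    _, frame = video.read()
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for x, y, w, h in face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
        cv2.rectangle(frame, (x, y), (x+w, y+h), (0, 0, 255), 1)
        try:
            analyze = DeepFace.analyze(frame, actions=['emotion'])
            cv2.putText(frame, analyze['dominant_emotion'], (x, y-10),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.9, (0, 0, 255), 2)  #label just above the red box
        except:
            pass  #no face found in this frame
    cv2.imshow('video', frame)
    if cv2.waitKey(1) == ord('q'):
        break

video.release()
cv2.destroyAllWindows()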
