Degree Name

Bachelor of Arts (BA)

Department or Program

Mathematics and Computer Science

First Advisor

Sugata Banerji

Second Advisor

Matthew R. Kelley

Third Advisor

Craig D. Knuckles


Recognizing human emotions from a person's body language is a complex neurological process that humans often take for granted. The visual pathway from the eye to the occipital lobe breaks an image down into basic features and builds on them layer by layer; this processing can be replicated in a computer with convolutional neural networks (CNNs), a class of artificial-intelligence algorithms from the field of computer vision. Like the brain, a neural network models the interaction between neurons, with each neuron represented by a mathematical unit called a perceptron. Real-world images of people in varied environments, displaying emotion through differences in expression, body language, coloring, and features, are manually labeled and used to train the CNN. The algorithm thereby learns to pick out features that recur in pictures of each emotion, relating body language, expressions, and poses to an emotion even when that relation was not defined beforehand. The accuracy of the classification is then analyzed by reviewing which features were extracted from the images, and the network is retrained accordingly to produce even more accurate results.
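The layered processing the abstract describes — convolution extracting local features, pooling condensing them, and a final layer mapping features to emotion scores — can be sketched in a few lines of NumPy. This is an illustrative toy, not the thesis's actual network: the filter values, the four-emotion label set, and the random weights are all assumptions made for demonstration.

```python
import numpy as np

EMOTIONS = ["happy", "sad", "angry", "neutral"]  # hypothetical label set

def conv2d(image, kernel):
    """Valid 2-D convolution (cross-correlation) over a grayscale image."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

def relu(x):
    # Nonlinearity: keep positive activations, zero out the rest.
    return np.maximum(x, 0)

def max_pool(x, size=2):
    # Downsample by keeping the strongest activation in each window.
    h2, w2 = x.shape[0] // size, x.shape[1] // size
    return x[:h2*size, :w2*size].reshape(h2, size, w2, size).max(axis=(1, 3))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def classify(image, kernel, weights):
    """One conv -> ReLU -> pool -> dense -> softmax pass."""
    features = max_pool(relu(conv2d(image, kernel)))
    scores = weights @ features.ravel()
    return softmax(scores)

rng = np.random.default_rng(0)
image = rng.random((8, 8))                 # stand-in for a labeled photo
kernel = np.array([[1., 0.], [0., -1.]])   # toy edge-detecting filter
feat_len = max_pool(relu(conv2d(image, kernel))).size
weights = rng.standard_normal((len(EMOTIONS), feat_len))  # untrained weights
probs = classify(image, kernel, weights)
print(dict(zip(EMOTIONS, np.round(probs, 3))))
```

In a trained CNN the kernel and dense weights would be learned from the manually labeled images via backpropagation, and many such conv/pool layers would be stacked, but the forward pass follows this same shape.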