Undergrad Research Project - Visualizations of Computer Perception

Spring 2017

Student: Steffen Holm
Advisor: Austin Lee
Project description

Computers have recently been gaining human-level perception of the world thanks to breakthroughs in computer vision and deep learning. Research by Nguyen et al. has shown that although neural networks can match human performance on tasks like image classification, they can be fooled by specially patterned images that humans would never misclassify. This result implies that although computers can perceive the world, they do not always understand it.

This research project seeks to visualize such computer perception processes in an attempt both to improve our understanding of these systems and to improve the process by which we evaluate that understanding. One process to be investigated is a convolutional neural network learning kernels for face detection. Traditional methods of evaluation are based on the accuracy of the final classification. By visualizing these kernels as they are being learned, we can create novel methods for evaluating the true understanding inherent in these models. In essence, we can watch a computer learn to see a face in real time and draw conclusions from its learning process. Additional media of computer perception, such as music, will also be explored. How do modern music visualization techniques seek to understand the underlying dynamics of a song? If we can create a more aesthetically pleasing visualization, does this necessarily reflect an improved level of computer perception? Technologies that will be used in this project include WebGL, OpenFrameworks, Affectiva SDK, Clubber.js, and other libraries as needed.
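
As a rough illustration of the kernel-visualization idea, the sketch below shows how first-layer convolution filters could be dumped as an image grid after every training step and then watched as they evolve. It is written in Python with PyTorch and Matplotlib purely for illustration; the model (`TinyFaceNet`), the helper `save_kernel_grid`, the random placeholder data, and the output file names are all assumptions and not part of the project's actual toolchain, which targets WebGL and OpenFrameworks for the final visualizations.

```python
# Sketch: save the first-layer conv kernels as an image grid after each
# training step, so their evolution can be watched over time.
# Assumes PyTorch + Matplotlib; model, data, and file names are placeholders.
import torch
import torch.nn as nn
import matplotlib.pyplot as plt


class TinyFaceNet(nn.Module):
    """Minimal stand-in for a face/non-face classifier."""
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 16, kernel_size=7)  # kernels to visualize
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Linear(16, 2)                    # face vs. non-face

    def forward(self, x):
        x = torch.relu(self.conv1(x))
        x = self.pool(x).flatten(1)
        return self.fc(x)


def save_kernel_grid(conv: nn.Conv2d, path: str) -> None:
    """Render each 2D kernel of a conv layer as one tile in a grid image."""
    kernels = conv.weight.detach().cpu().numpy()      # (out_ch, in_ch, k, k)
    fig, axes = plt.subplots(4, 4, figsize=(6, 6))
    for ax, kernel in zip(axes.flat, kernels):
        ax.imshow(kernel[0], cmap="gray")             # first input channel only
        ax.axis("off")
    fig.savefig(path)
    plt.close(fig)


model = TinyFaceNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

for step in range(100):
    # Placeholder batch of 32 grayscale 64x64 images with random labels;
    # a real run would load an actual face dataset here.
    images = torch.randn(32, 1, 64, 64)
    labels = torch.randint(0, 2, (32,))

    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()

    save_kernel_grid(model.conv1, f"kernels_step_{step:03d}.png")
```

The saved frames could then be assembled into an animation or streamed to a WebGL front end, which is one possible route to the real-time view of the learning process described above.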

The end result will be a standalone web demo and, if time permits, an external exhibit that allows for human interaction.
