This page documents my personal work on the larger project Sentiment Voice: Integrating Emotion AI & VR, a Virtual Reality experience that weaves user emotions into an adaptive environment, aimed at raising public awareness of and engagement with emotional data tracking. The project is a collaborative effort among several engineers and artists.
"And while there is an enormous structural power asymmetry between the surveillers and surveilled, neither are those with the greatest power free from being haunted by a very particular kind of data anxiety: that no matter how much data they have, it is always incomplete, and the sheer volume can overwhelm the critical signals in a fog of possible correlations."
Kate Crawford
The Emotional Analysis project brings us to an interesting crossroads of data biases, machine learning, and virtual reality. The user enters an inviting, stylized, low-poly city. Environmental elements begin to shift subtly at first, driven only by the user's face resting in the headset. The wearer has been prompted with questions like "Talk about how the environment makes you feel." After the user describes some element of the world - the cafe at the bottom of a skyscraper or a large truck passing by - the weather, car densities and speeds, or lighting starts to change. The user's facial expressions and sentiment analysis appear on the data visualization board adjacent to the live stream from the headset. The user often responds with a comment like "How can it understand what I'm feeling?", and the world begins shifting toward a grayish setting with cars disappearing and reappearing dynamically. We do our best to answer these sorts of questions, but the layers of abstraction of our own creation prohibit us from truly knowing.
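The exact pipeline driving those changes isn't spelled out here, but the core idea can be sketched: emotion signals from the face tracking and sentiment analysis of the user's speech are blended into a small set of environment parameters that the scene reads continuously. The snippet below is a minimal, hypothetical Python sketch of that mapping; the names (EnvironmentState, blend_environment) and the specific weights are my illustrative assumptions, not the project's actual code.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class EnvironmentState:
    """Hypothetical scene parameters a VR client could read each frame."""
    overcast: float      # 0.0 = clear sky, 1.0 = fully gray
    car_density: float   # 0.0 = empty streets, 1.0 = heavy traffic
    car_speed: float     # multiplier on baseline vehicle speed
    light_warmth: float  # 0.0 = cold lighting, 1.0 = warm lighting


def blend_environment(face_valence: float, text_sentiment: float,
                      previous: Optional[EnvironmentState] = None,
                      smoothing: float = 0.8) -> EnvironmentState:
    """Map emotion signals in [-1, 1] to scene parameters.

    face_valence:   valence inferred from facial expression analysis
    text_sentiment: polarity from sentiment analysis of the user's speech
    The two signals are averaged, then exponentially smoothed against the
    previous state so the world drifts gradually rather than snapping.
    """
    mood = 0.5 * (face_valence + text_sentiment)   # combined signal in [-1, 1]
    target = EnvironmentState(
        overcast=max(0.0, -mood),                  # negative mood grays out the sky
        car_density=0.5 + 0.5 * mood,              # positive mood fills the streets
        car_speed=1.0 + 0.3 * mood,                # traffic slows as mood drops
        light_warmth=0.5 + 0.5 * mood,             # warmer light for positive mood
    )
    if previous is None:
        return target

    def lerp(old: float, new: float) -> float:
        return smoothing * old + (1.0 - smoothing) * new

    return EnvironmentState(
        overcast=lerp(previous.overcast, target.overcast),
        car_density=lerp(previous.car_density, target.car_density),
        car_speed=lerp(previous.car_speed, target.car_speed),
        light_warmth=lerp(previous.light_warmth, target.light_warmth),
    )


if __name__ == "__main__":
    # Example: a neutral expression paired with a negative spoken remark.
    state = blend_environment(face_valence=0.0, text_sentiment=-0.7)
    print(state)
```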
I'm proud to announce that the project won 1st place at my university's Senior Design Expo for the Computer Science Department.