Graphic Language for Emotion-Based Interface

#TouchDesigner #TeachableMachine #ComputerVision #Emotion #Synesthesia #ComputerGraphic
Aside from its scientific value, which depends upon an exact examination of the individual art elements, the analysis of the art elements forms a bridge to the inner pulsation of a work of art.
— Wassily Kandinsky, POINT AND LINE TO PLANE
I created a dataset of facial-expression images displaying seven distinct emotions (anger, contempt, disgust, fear, happiness, sadness, surprise) and used Teachable Machine to train a machine learning model to recognize them. The trained model was then plugged into TouchDesigner to create this emotion-based interface, which responds to the user's emotions in real time with visual elements such as color, shape, spatiality, and motion. I assigned specific graphic and animation elements to each facial expression to convey its emotion; how these elements are selected and configured determines the visual appearance and emotional expression of the graphic outcome. The parameters of these elements are partially based on the theory of Kandinsky, whose synesthesia (a neurological condition in which hearing music caused him to see colors) heavily influenced his paintings. Like Kandinsky's synesthesia, this emotion-based interface connects users' emotions to animated computer graphics, working as a synesthetic channel that bridges different senses and experiences.
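The emotion-to-graphics mapping described above can be sketched in a few lines of Python (the language TouchDesigner scripts in). This is a minimal illustration, not the project's actual settings: the `EMOTION_STYLES` values, the label order, and the `style_for` helper are all hypothetical, standing in for whichever parameters drive the real patch.

```python
# Hypothetical mapping from each emotion class to graphic parameters
# (RGB color, base shape, motion speed). The specific values below are
# illustrative assumptions, not the project's actual configuration.
EMOTION_STYLES = {
    "anger":     {"color": (1.0, 0.1, 0.0), "shape": "triangle", "speed": 2.0},
    "contempt":  {"color": (0.4, 0.4, 0.5), "shape": "line",     "speed": 0.5},
    "disgust":   {"color": (0.3, 0.6, 0.1), "shape": "blob",     "speed": 0.8},
    "fear":      {"color": (0.2, 0.0, 0.4), "shape": "spike",    "speed": 1.5},
    "happiness": {"color": (1.0, 0.8, 0.0), "shape": "circle",   "speed": 1.2},
    "sadness":   {"color": (0.0, 0.2, 0.8), "shape": "wave",     "speed": 0.3},
    "surprise":  {"color": (1.0, 0.4, 0.8), "shape": "burst",    "speed": 1.8},
}

# Class order as it might come out of a Teachable Machine export
# (assumed alphabetical here).
LABELS = ["anger", "contempt", "disgust", "fear",
          "happiness", "sadness", "surprise"]

def style_for(probabilities):
    """Pick the graphic parameters for the most probable emotion."""
    top = max(range(len(LABELS)), key=lambda i: probabilities[i])
    return LABELS[top], EMOTION_STYLES[LABELS[top]]
```

In a TouchDesigner patch, a function like this would run each frame on the model's class probabilities, and the returned parameters would be written to the operators controlling color, geometry, and animation speed.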
Animation
Happiness
Sadness
Anger
Disgust
Fear
Surprise
Contempt
Training Teachable Machine
Visualizing Seven Emotions
Anger
Sadness
Fear
Contempt
Surprise
Happiness
Disgust
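Because the interface reacts to a live webcam, raw per-frame predictions can jitter between emotions and make the visuals flicker. One common way to stabilize real-time classifier output is exponential smoothing; the sketch below is an assumption about how this could be done, not necessarily the project's actual method, and the class count and smoothing factor are illustrative.

```python
class EmotionSmoother:
    """Exponentially smooth per-frame class probabilities so the
    visuals transition gradually instead of flickering between
    emotions. alpha controls responsiveness (higher = faster)."""

    def __init__(self, n_classes=7, alpha=0.1):
        self.alpha = alpha
        # Start from a uniform distribution over the emotion classes.
        self.state = [1.0 / n_classes] * n_classes

    def update(self, probabilities):
        """Blend the new frame's probabilities into the running state."""
        self.state = [
            (1 - self.alpha) * s + self.alpha * p
            for s, p in zip(self.state, probabilities)
        ]
        return self.state
```

Calling `update()` once per frame yields probabilities that drift toward the detected emotion over many frames, so color and motion parameters driven by them animate smoothly.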