A research paper titled “Empa Talk: A Physiological Data Incorporated Human-Computer Interactions” has been accepted to the CHI 2014 Work-in-Progress session.
We present a novel approach that allows a user to feel another person’s emotional state while communicating in a video chat. The system combines physiological sensors with multimodal displays. In our first prototype, we employed a Galvanic Skin Response (GSR) sensor and a Blood Volume Pulse (BVP) sensor, as these signals are well-established indicators of human emotion. A vibrotactile motor and an RGB LED were used to convey the partner’s emotional state on the wearer’s wrist. Alongside the hardware, we implemented software for processing, transmitting, and displaying the biofeedback data.
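As a rough illustration of the processing-and-display step, the sketch below maps raw GSR and BVP-derived heart-rate readings to a vibration intensity and an LED color. All sensor ranges, the arousal formula, and the color mapping are illustrative assumptions for this sketch, not the values or method used in the paper.

```python
# Hypothetical sketch of mapping physiological readings to wrist feedback.
# Ranges and weights below are assumptions, not the paper's actual parameters.

def normalize(value, lo, hi):
    """Clamp a raw sensor reading into the 0.0-1.0 range."""
    return max(0.0, min(1.0, (value - lo) / (hi - lo)))

def arousal_to_display(gsr_raw, heart_rate):
    """Map raw GSR (microsiemens) and BVP-derived heart rate (bpm)
    to a vibration duty cycle and an RGB LED color tuple."""
    gsr = normalize(gsr_raw, 1.0, 20.0)        # assumed GSR range (uS)
    hr = normalize(heart_rate, 50.0, 120.0)    # assumed heart-rate range (bpm)
    arousal = 0.5 * gsr + 0.5 * hr             # simple combined arousal estimate
    vibration = int(arousal * 255)             # PWM duty cycle for the motor
    # Low arousal -> calm blue, high arousal -> red.
    color = (int(arousal * 255), 0, int((1.0 - arousal) * 255))
    return vibration, color
```

In a real prototype these values would be streamed to the partner’s device over the network and written to the motor and LED pins; here they are simply returned for clarity.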