For human-robot interaction it is essential to detect the faces of the people interacting with the robot. This allows the robot to identify which subject is speaking to it and/or the emotions of the subjects interacting with it. Event cameras promise high-temporal-resolution, low-latency tracking of facial expressions. However, the task becomes harder when the subject does not move, since it is then difficult to detect the facial features (eyes, nose, mouth...) needed to localize the face.
To this end, the temporal pattern of eye blinks can be used to detect and localize the eyes, and from the distance between the eyes the face itself can then be localized. Although prior works have used eye blinks to detect and track the face, we are working on a method that is robust to different lighting conditions (artificial and natural) and to different scales.
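The pipeline described above can be sketched as follows. This is a minimal illustration, not the authors' method: it accumulates synthetic events into a short time window, picks the two strongest activity peaks as blink candidates, and scales a face bounding box by the inter-eye distance. The burst model, peak-suppression radius, and the face-to-eye-distance factors (2.5x) are all assumptions made for the example.

```python
import numpy as np

def blink_events(cx, cy, t0, n=300, sigma=2.0, rng=None):
    """Synthetic blink: a 50 ms burst of events clustered around (cx, cy)."""
    if rng is None:
        rng = np.random.default_rng(0)
    xs = rng.normal(cx, sigma, n).astype(int)
    ys = rng.normal(cy, sigma, n).astype(int)
    ts = t0 + rng.uniform(0.0, 0.05, n)
    return np.stack([xs, ys, ts], axis=1)

def detect_eye_candidates(events, shape, t_start, t_end, k=2, min_sep=10):
    """Accumulate events in [t_start, t_end) into an activity map and
    return the k strongest peaks (candidate eye locations)."""
    sel = (events[:, 2] >= t_start) & (events[:, 2] < t_end)
    acc = np.zeros(shape)
    np.add.at(acc, (events[sel, 1].astype(int), events[sel, 0].astype(int)), 1)
    peaks = []
    for _ in range(k):
        y, x = np.unravel_index(np.argmax(acc), acc.shape)
        peaks.append((int(x), int(y)))
        # Suppress the neighborhood so the next argmax finds a distinct peak.
        acc[max(0, y - min_sep):y + min_sep, max(0, x - min_sep):x + min_sep] = 0
    return peaks

def face_box_from_eyes(eyes, width_factor=2.5, height_factor=2.5):
    """Estimate a face bounding box from the inter-eye distance.
    The scale factors are illustrative assumptions, not calibrated values."""
    (x1, y1), (x2, y2) = eyes
    d = np.hypot(x2 - x1, y2 - y1)
    cx, cy = (x1 + x2) / 2.0, (y1 + y2) / 2.0
    w, h = width_factor * d, height_factor * d
    # Eyes sit roughly in the upper third of a face.
    return (cx - w / 2.0, cy - h / 3.0, w, h)

# Two simultaneous blinks at (40, 50) and (80, 50) on a 160x120 sensor.
events = np.vstack([
    blink_events(40, 50, 0.1, rng=np.random.default_rng(1)),
    blink_events(80, 50, 0.1, rng=np.random.default_rng(2)),
])
eyes = detect_eye_candidates(events, (120, 160), 0.0, 0.2)
box = face_box_from_eyes(eyes)
```

Because a blink produces a sharp, localized burst of events even on an otherwise static face, the peak-picking step works precisely in the case where frame-based feature detectors struggle; a real system would additionally verify that the two candidates blink with correlated timing.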