The Sign Language Technologies Team continuously provides the Greek and international scientific communities with sign language (SL) data for research and education in SL technologies and human-robot interaction (HRI).

Technology transfer began in 2006 with an innovative approach to robot tele-operation using a set of Greek Sign Language (GSL) handshapes, developed within the DIANOEMA project's research on visual analysis and recognition of gestures with application to robot tele-operation.

In the framework of the MOBOT (FP7) project, both the creation and annotation of the project's multisensory, multimodal dataset and the definition of a multimodal human-robot communication model drew on the group's expertise in collecting sign language data and in the semantic analysis of a set of basic GSL signs.

In the framework of the COGNIMUSE project (Multimodal Signal and Event Processing in Perception and Cognition), previous research on sign language semantics led to the study of the semantically intensive characteristics of embodied communication, which are common to oral and sign language production.