Description: Facial expression recognition tools are trained and evaluated on benchmark datasets in which many expressions are produced on request (posed) and photographed frontally (en face). This does not match real-world use, where expressions are weaker and the face is not always turned towards the camera. The project should: (a) identify a catalogue of situations that frequently arise when interacting with camera-based systems (e.g., head tilt or turn, varying image quality and resolution); (b) prepare a database of images showing facial expressions in a natural (non-posed) way in these situations; (c) identify existing APIs for recognising emotions from facial expressions; (d) evaluate the identified APIs on the prepared set; (e) summarise the results, indicating the strengths and weaknesses (supported situations) of each API.
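
For steps (d) and (e), a minimal evaluation harness could look like the sketch below. It assumes a CSV manifest (columns: image_path, true_emotion, situation) describing the prepared image set from step (b), and a hypothetical `example_api` wrapper standing in for each real emotion-recognition API found in step (c); none of these names come from the project text. Per-situation accuracy then feeds directly into the strengths/weaknesses summary of step (e).

```python
import csv
from collections import defaultdict
from typing import Callable, Dict

# Hypothetical wrapper around one emotion-recognition API; in the real
# project this would call a cloud endpoint or a local model and return
# the predicted emotion label for the given image.
def example_api(image_path: str) -> str:
    # Placeholder prediction; replace with an actual API call.
    return "neutral"

# APIs under evaluation, keyed by name (names are illustrative only).
APIS: Dict[str, Callable[[str], str]] = {
    "example-api": example_api,
}

def evaluate(manifest_path: str) -> None:
    """Compute per-situation accuracy for each API.

    The manifest is assumed to be a CSV with columns:
    image_path, true_emotion, situation (e.g. 'head_tilt', 'low_resolution').
    """
    # counts[api][situation] = [number correct, number evaluated]
    counts = defaultdict(lambda: defaultdict(lambda: [0, 0]))

    with open(manifest_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            for api_name, predict in APIS.items():
                predicted = predict(row["image_path"])
                cell = counts[api_name][row["situation"]]
                cell[1] += 1
                if predicted == row["true_emotion"]:
                    cell[0] += 1

    # Summary for step (e): accuracy per API per situation reveals
    # which situations each API supports well and where it fails.
    for api_name, by_situation in counts.items():
        print(f"== {api_name} ==")
        for situation, (correct, total) in sorted(by_situation.items()):
            print(f"  {situation}: {correct}/{total} = {correct / total:.2%}")

if __name__ == "__main__":
    evaluate("manifest.csv")
```

Reporting accuracy broken down by situation (rather than a single overall score) is what makes it possible to state, for each API, which of the catalogued situations it handles and which it does not.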