Lip-password, developed by Prof Cheung Yiu-ming of the Department of Computer Science, is a patented double-security system that verifies a person's identity by simultaneously matching the password content with the underlying behavioural characteristics of the speaker's lip movements. Prof Cheung adopted a computational learning model that extracts visual features of lip shape, texture and movement to characterise the lip sequence.
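The article does not disclose the patented algorithm itself, but the "double security" idea, checking both what is spoken and how the speaker's lips move while speaking it, can be illustrated with a toy sketch. The Python snippet below uses hypothetical feature extractors and synthetic frames purely for illustration; it is not Prof Cheung's model, and every function name and threshold here is an assumption.

```python
# Illustrative sketch only: a toy two-stage check, NOT the patented lip-password method.
# Hypothetical feature extractors and synthetic data are used throughout.
import numpy as np

def lip_features(frame: np.ndarray) -> np.ndarray:
    """Hypothetical per-frame descriptor standing in for lip shape and texture cues.
    Here we simply downsample and normalise the frame."""
    small = frame[::8, ::8].astype(np.float32)
    return small.flatten() / 255.0

def sequence_descriptor(frames: list[np.ndarray]) -> np.ndarray:
    """Characterise a lip sequence: average the per-frame features (a crude proxy for
    password content) and add frame-to-frame variation (a crude proxy for movement style)."""
    feats = np.stack([lip_features(f) for f in frames])
    motion = np.diff(feats, axis=0, prepend=feats[:1])
    return np.concatenate([feats.mean(axis=0), motion.std(axis=0)])

def verify(enrolled: np.ndarray, attempt: np.ndarray,
           content_thresh: float = 0.15, behaviour_thresh: float = 0.25) -> bool:
    """Toy double check: the attempt must match both the enrolled password content
    (first half of the descriptor) AND the behavioural style (second half)."""
    half = enrolled.size // 2
    content_dist = np.linalg.norm(enrolled[:half] - attempt[:half])
    behaviour_dist = np.linalg.norm(enrolled[half:] - attempt[half:])
    return content_dist < content_thresh and behaviour_dist < behaviour_thresh

# Synthetic demo: enrol one sequence, then test a near-identical attempt.
rng = np.random.default_rng(0)
enrol_frames = [rng.integers(0, 256, (64, 64), dtype=np.uint8) for _ in range(20)]
attempt_frames = [f.copy() for f in enrol_frames]  # same user uttering the same phrase
print(verify(sequence_descriptor(enrol_frames), sequence_descriptor(attempt_frames)))
```

The point of the sketch is the dual threshold: an impostor who knows the phrase fails the behavioural check, while the genuine user uttering the wrong phrase fails the content check.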
The user simply has to utter a short phrase of their choice as the password. Unlike voice recognition systems, the user is not required to speak aloud, so there are no language restrictions and people with speech disabilities can also set a password. Moreover, the password can be reset at any time to strengthen security, and the system is not limited by age-related facial changes, giving lip-password an advantage over other typical biometric recognition systems.
The potential applications of this new patented technology include keeping personal data private on smartphones and computers, gaining access to buildings and vehicles, facilitating immigration clearance, and securing financial transactions. The invention was awarded a gold medal with distinction and an Award of Excellence from Romania at the 46th International Exhibition of Inventions, Geneva.