Engendered biometrics
Computerised gender recognition could be useful in physiological and psychological analysis as well as in security applications. However, accurately classifying a human face as being of one particular gender is difficult, particularly with reference to shape, facial features and behaviour. Researchers from the US and The Netherlands have reviewed the state of the art in gender recognition software. Despite the obstacles, the team found that classification based on biological information from physiological signals is the least easily confounded and the most accurate (approximately 92 to 100%), compared with gender classification methods that do not use this approach. Biometric methods are 80 to 99% accurate, less invasive and are not confused by mood or clothing. Gender classification based on social network behaviour is the least accurate (67 to 88.6%). “A single approach cannot satisfy all the gender classification requirements in various conditions, and each gender classification approach is suitable in a particular field according to the characteristic of performance,” the team reports.
Lin, F., Wu, Y., Zhuang, Y., Long, X. and Xu, W. (2016) ‘Human gender classification: a review’, Int. J. Biometrics, Vol. 8, Nos. 3/4, pp.275–300.
The morning sun really shows your age
The pattern of wrinkles in the ageing human face is as unique as a fingerprint and as such might be useful in age-based intelligent systems for applications such as biometrics, security and surveillance, according to researchers in India. The team has developed a novel human age classification system that uses local binary patterns and wrinkle analysis to extract ageing features, and a multi-class support vector machine to assign face images to one of four age classes. They report an accuracy in age classification of more than 91% even with “noisy” images of the human face once their “salt-and-pepper” noise filter is applied, although there is room for improvement in feature extraction after filtering for Gaussian noise.
Jagtap, J. and Kokare, M. (2017) ‘Local binary patterns and wrinkle analysis in combination with multi-class support vector machine for human age classification’, Int. J. Applied Pattern Recognition, Vol. 4, No. 1, pp.1–13.
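The front end of a pipeline like the one described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it assumes a 3×3 median filter as the salt-and-pepper remedy and the basic 8-neighbour LBP operator; the resulting 256-bin histogram is the kind of feature vector that would then be passed to a multi-class SVM for the four-way age decision.

```python
import numpy as np

def median_filter3(img):
    """3x3 median filter -- a standard remedy for salt-and-pepper noise."""
    padded = np.pad(img, 1, mode="edge")
    h, w = img.shape
    windows = np.stack([padded[r:r + h, c:c + w]
                        for r in range(3) for c in range(3)])
    return np.median(windows, axis=0)

def lbp_histogram(img):
    """Basic 8-neighbour local binary pattern, summarised as a 256-bin histogram."""
    centre = img[1:-1, 1:-1]
    codes = np.zeros(centre.shape, dtype=np.int32)
    # Offsets of the eight neighbours, clockwise from the top-left corner.
    offsets = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    for bit, (dr, dc) in enumerate(offsets):
        neighbour = img[dr:dr + centre.shape[0], dc:dc + centre.shape[1]]
        codes += (neighbour >= centre).astype(np.int32) << bit
    hist = np.bincount(codes.ravel(), minlength=256).astype(float)
    return hist / hist.sum()  # normalised feature vector for the classifier

# Toy example: denoise a (hypothetical) face patch, then extract LBP features.
rng = np.random.default_rng(0)
patch = rng.integers(0, 256, size=(32, 32)).astype(float)
features = lbp_histogram(median_filter3(patch))
```

In practice the paper's wrinkle analysis would contribute further features, and a library SVM (for example, a one-vs-one multi-class classifier) would be trained on labelled face images to map such vectors to the four age classes.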
Twitter complexity
Researchers in Germany explain that almost every analysis of social media and social networking sites fails to take into account the complexity of human beings when investigating the properties of systems such as the microblogging platform Twitter. They have now demonstrated that behaviour is a key feature in detecting users with specific characteristics. Their research demonstrates how meaningful patterns of user behaviour can be extracted on a large scale, reflecting the personalities of those users. “This,” the team suggests, “is a first step to prediction of user action and the underlying individual decision-making process.” Their approach identifies clusters methodically while allowing for automated detection of user behaviour patterns.
Klotz, C., Akinalp, C. and Unger, H. (2016) ‘Clustering user behaviour patterns on Twitter’, Int. J. Social Network Mining, Vol. 2, No. 3, pp.203–223.
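The paper's own clustering method is not detailed above; purely as an illustration of the general idea of grouping users by behavioural features, a plain k-means pass over hypothetical per-user statistics (tweets per day, retweet ratio, replies per day are invented feature names) might look like this:

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain k-means: assign each user to the nearest centroid, then update."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(iters):
        # Distance from every user vector to every centroid.
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

# Two synthetic behavioural profiles: casual posters vs heavy retweeters.
rng = np.random.default_rng(1)
users = np.vstack([rng.normal([2, 0.1, 1], 0.3, (40, 3)),
                   rng.normal([30, 0.7, 5], 2.0, (40, 3))])
labels, centroids = kmeans(users, k=2)
```

Cluster membership can then be inspected against known user characteristics; the automated-detection angle the authors describe would sit on top of such groupings.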
Virtual security
A virtual machine (VM) emulates the behaviour of a computer system. VMs are based on computer architecture and mimic the functionality of a physical computer. They are widely used in a range of settings, not least cloud computing, to allow users to run operating systems meant for one kind of computer architecture on another, and to allow several independent systems to run on a single server. US researchers have looked at the security of VMs in the context of cloud computing and suggest that several bad practices ought to be corrected in order to tighten security in many systems. Risk-management plans and appropriate auditing represent the baseline if providers are to protect their systems from intruders, malware and other computer security threats.
Khan, A. (2017) ‘Virtual machine security’, Int. J. Information and Computer Security, Vol. 9, Nos. 1/2, pp.49–84.