Faception
Faception is a tool that uses computer vision and machine learning to analyse facial images, profile people and make real-time predictions about their personality, behaviour and character traits.
The technology is able to "analyze faces from video streams (recorded and live), cameras, or online/offline databases, encode the faces in proprietary image descriptors and match an individual with various personality traits and types with a high level of accuracy," according to its developer of the same name.
Faception says it has developed 15 “classifiers,” each of which describes a certain personality type or trait - including Academic researcher, Bingo player, Terrorist and Paedophile.
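The pipeline the company describes, encoding a face into an image descriptor and then matching it against per-trait "classifiers", can be sketched as follows. Everything here is hypothetical: the prototype vectors, trait names and the trivial `encode_face` stand-in are invented for illustration, since Faception's actual descriptors and matching method are proprietary and undisclosed.

```python
import numpy as np

# Hypothetical per-trait prototype vectors standing in for Faception's
# proprietary "classifiers". The trait names echo those the company lists;
# the vectors are arbitrary.
TRAIT_PROTOTYPES = {
    "academic_researcher": np.array([0.9, 0.1, 0.0]),
    "bingo_player": np.array([0.1, 0.8, 0.1]),
}

def encode_face(image: np.ndarray) -> np.ndarray:
    """Stand-in for a proprietary image descriptor: here, just per-channel means."""
    return image.reshape(-1, image.shape[-1]).mean(axis=0)

def match_traits(descriptor: np.ndarray) -> dict:
    """Score a descriptor against each trait prototype by cosine similarity."""
    scores = {}
    for trait, proto in TRAIT_PROTOTYPES.items():
        sim = descriptor @ proto / (np.linalg.norm(descriptor) * np.linalg.norm(proto))
        scores[trait] = float(sim)
    return scores

# Toy 2x2 "image" with 3 channels.
image = np.ones((2, 2, 3))
scores = match_traits(encode_face(image))
```

The sketch only illustrates the *shape* of such a system; it says nothing about whether facial descriptors carry any signal about personality, which is precisely the scientific claim critics dispute.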
The company claims its technology is used in public places such as airports, train stations, government and public buildings, and border control areas.
Website: Faception 🔗
Released: 2016
Developer: Faception
Purpose: Identify personality type; Predict behaviour
Type: Facial personality analysis
Technique: Computer vision; Behavioural analysis; Emotion recognition; Facial recognition; Personality analysis; Machine learning
Faception suffers from several significant transparency and accountability issues:
Algorithmic decision-making. Faception's algorithm is proprietary, with the company refusing to disclose the methods used to analyse facial features and make predictions.
Privacy. Faception collects and analyses sensitive biometric data, including facial images. However, the company's data protection policies and procedures are not publicly accessible or transparent, and it is unclear how this sensitive information is stored, shared or protected.
Deployment. Faception refuses to say how its technology is used in various contexts, including law enforcement, border control and employment screening, making it difficult to assess the potential risks and benefits of its technology.
Errors or misclassifications. Faception's technology is liable to errors or misclassifications, but the company has failed to establish clear procedures for addressing these kinds of errors or to provide recourse for those wrongly affected.
Regulation and oversight. Faception operates in a regulatory grey area, and there is limited oversight of its activities. This lack of regulation can lead to unchecked use of the technology, potentially resulting in harm to individuals or groups.
Faception's claims to predict complex personality traits and behaviours, such as identifying terrorists or paedophiles, have been criticised for potentially leading to false positives and unjust profiling.
Specifically, the company's claims about the accuracy of its technology are based on largely unproven scientific theories about the relationship between facial features and personality traits.
Furthermore, its technology appears to be based on Western cultural and social norms, which may not be applicable in other cultural contexts, and the company does not appear to have demonstrated an understanding of the potential cultural and social implications of its technology.
Page info
Type: System
Published: March 2023
Last updated: April 2025