AIAAIC's "Ethical Issue Taxonomy" describes topics of concern posed by the inappropriate, unethical or illegal use of an AI, algorithmic and automation system, or set of systems, and/or it's governance.
Applied to entries to the AIAAIC Repository, the taxonomy aims to present ethical concerns about these technologies in a clear and succinct manner that is understandable to people using and directly or indirectly impacted by these systems.
It is primarily intended to help researchers, civil society organisations, journalists, students, users and others to identify and understand the ethical dimensions and challenges posed by AI, algorithmic and automation systems, and to resist and hold them accountable where warranted.
The Ethical Issue Taxonomy is available to third parties to download, comment upon, update, and re-use in line with our terms of use.
Accessibility. The ability/inability of disabled people, the elderly, non-internet users, and other disadvantaged and vulnerable people to access and engage with a system at any time, without delay or downtime.
Accountability. The ability/inability of users, researchers, lawyers and others to submit complaints and appeals, and to meaningfully investigate, evaluate, and hold accountable the individuals and entities legally responsible and liable for a system's impacts.
Accuracy/reliability. The extent to which a system behaves dependably, accurately and consistently in the situation for which it is designed, or leads to low-quality, inappropriate or harmful decisions, especially when used in high-stakes environments such as healthcare, transportation, policing, or finance.
Alignment. The extent to which a system, including its objectives and incentivisation, is seen to be in line with human values, ethics and needs, and is considered suitable and acceptable for the specific context and purpose in which it is deployed.
Anthropomorphism. The attribution of human traits, emotions, intentions, or behaviours to non-human entities such as AI and robotics systems by system designers, developers and operators, and/or by their users and others.
Appropriation. The use of the cultural, intellectual, or symbolic information or works belonging to or associated with an individual or community, without acknowledgement or permission.
Authenticity/integrity. The design, development and use of a system in a genuine and true manner, as opposed to deception, falsification, plagiarism, misrepresentation and other potentially harmful uses.
Automation bias. The excessive trust and reliance on automated systems and decision support tools, often favouring their suggestions even when more accurate contradictory information is available from other sources.
Autonomous weapons. The design, development, and deployment of lethal autonomous weapons systems that can select and engage military and other human and non-human targets with little or no human control.
Autonomy/agency. The inability of a system's users or people impacted by it to maintain control over their own decisions and independence.
Competition/monopolisation. The use or misuse of a system that results in the actual or potential distortion of market dynamics, entrenchment of dominant players, or undermining of fair competition.
Dual use. The design and development of a system for multiple purposes, or its misuse for purposes beyond its original stated purpose, including military and criminal use.
Employment/labour. The use or misuse of systems that replace human jobs or change working conditions in ways that create job loss, insecurity, or inequality.
Environment. The development, deployment, or operation of a system in such a way that it damages the environment through excessive energy consumption, resource depletion, pollution, or other actions.
Fairness. The creation or amplification of unfair, prejudiced or discriminatory results due to biased data, poor governance or other factors.
Human rights/civil liberties. The use or misuse of a system to directly or indirectly erode or impair the human rights and civil freedoms of a user, group of users, or others.
Mis/disinformation. The use or misuse of a system to create and/or share information and data that deceives - accidentally or deliberately - the general public and others.
Normalisation. The process - declared or otherwise - by which a system, or set of systems, shifts from being viewed as novel, disruptive, or ethically questionable to being seen as standard, routine, or essential. Also, the use of the system to normalise inappropriate or unethical beliefs or behaviours.
Privacy/surveillance. The violation of personal privacy caused by the use or misuse of a system or set of systems, including mass surveillance systems.
Representation. The use or misuse of a system to portray an individual, group, or idea in a manner that is misleading or untrue, resulting in unfairness, harm, injustice, or the denial of dignity to the represented entity.
Revisionism. The use or misuse of a system to change an established or accepted doctrine, policy, or historical view.
Robot rights. The granting of protections and moral considerations - such as fair treatment or limits on harm - to robots based on their capabilities or perceived autonomy, thereby defining how they should behave in relation to humans, and how humans should treat them.
Safety. The risks posed to the physical, psychological and mental safety of users, animals and property by the use or misuse of a system and its governance.
Security. The protection of a system from breaches, leaks or unauthorised use in order to maintain the privacy and confidentiality of data and information.
Transparency. The degree and manner in which a system and its governance - including its purpose, inner workings, and known risks and impacts - are clearly and accurately described and understandable to users, the general public, policymakers, and other stakeholders, as opposed to communicated in a misleading, partial or otherwise opaque manner.
December 5, 2025: Added "Appropriation" and "Normalisation"; Replaced "Bias/discrimination" with "Fairness"; Added "Revisionism"; Removed "Liability"; Renamed "Human/civil rights" as "Human rights/civil liberties"; Updated "Accountability" definition to include reference to liability