

Can sentencing be enhanced with artificial intelligence?

29 May 2018, by Amy Sarcevic

Since the 17th century, judicial decisions have been handed down by human beings, for want of a better alternative.

But, oddly, the arrival of a better alternative – artificial intelligence (AI) – has been met with doubt, concern and, in some cases, resistance.

“There’s something distinctly uncomfortable and counterintuitive about letting a robot decide the fate of a human being”, says Professor Dan Hunter of Swinburne University. “But using artificial intelligence to make judicial decisions can actually lead to more accurate sentencing”.

Research has shown that the human brain is highly fallible and susceptible to ‘heuristics’ and ‘biases’ when making judgments, which compromises the integrity of judicial decisions.

‘Heuristics’ are mental shortcuts and rules that have been hard-coded by evolution. They are designed to make the human brain work more efficiently, but can often lead to psychological ‘biases’ – that is, errors in perception, reasoning or memory that result in inaccurate judgments or decisions.

Extensive research has shown, for example, that people draw different conclusions from the same information, based on how that information is presented to them; a phenomenon known as ‘the framing effect’.

They are also more likely to recall negative information than positive information (‘negativity bias’); more likely to react strongly when a single identifiable person is at risk, as opposed to a large group of people (‘identifiable victim effect’); and more likely to believe a statement to be true if it is easier to process or has been stated multiple times (‘illusory truth effect’). Not to mention the powerful effects of racial, age, gender and socioeconomic stereotyping.

On top of this, research shows that people tend to be overconfident in their own decisions. In one experiment, people who rated their answers as 99% certain turned out to be incorrect, on average, 40% of the time.

These phenomena have led to differences in the way sentences are administered across the judicial system.

“Punishments should fit crimes with consistency”, says Professor Hunter. “Through the use of AI, we can administer sentences in a logical and algorithmic manner, without being liable to the types of biases that affect human decision making”.
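
To make the idea of sentencing “in a logical and algorithmic manner” concrete, here is a minimal, entirely hypothetical sketch. None of the factors, weights or discounts below come from Professor Hunter’s work or any real sentencing system; they are invented purely to illustrate how a rule-based recommendation applies the same rules to the same facts every time.

```python
# Toy illustration only: every factor, weight and discount here is invented,
# not drawn from any real or proposed sentencing system.
from dataclasses import dataclass

@dataclass
class CaseFacts:
    offence_severity: int   # hypothetical scale, e.g. 1 (minor) to 10 (most serious)
    prior_convictions: int
    guilty_plea: bool

def recommend_sentence_months(case: CaseFacts) -> int:
    """Toy recommendation derived only from the stated case facts –
    never from age, race, gender or socioeconomic status."""
    months = case.offence_severity * 6            # base term scales with severity
    months += min(case.prior_convictions, 5) * 3  # capped uplift for prior history
    if case.guilty_plea:
        months = int(months * 0.75)               # fixed, transparent plea discount
    return months

# Identical facts always produce an identical recommendation:
print(recommend_sentence_months(CaseFacts(4, 1, True)))  # 20
print(recommend_sentence_months(CaseFacts(4, 1, True)))  # 20
```

Because the recommendation depends only on the stated case facts, two identical cases can never receive different outcomes – the consistency Professor Hunter describes – though the fairness of the rules themselves still rests with the humans who write them.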

Professor Dan Hunter is the founding Dean of Swinburne Law School and an international expert in cognitive science models of law.

He will share further insights into this revolutionary AI technology and how it will impact the judicial system, both practically and ethically, at the Prisons 2018 Conference – due to take place on 2–3 August in Melbourne.

Learn more and register.
