ALGORITHMS AS LEGAL DECISIONS: GENDER GAPS AND CANADIAN EMPLOYMENT LAW IN THE 21st CENTURY

Anthony Niblett

Introduction

Should judges and arbitrators in Canada use algorithms to assist with their decision making? Could we ever replace the decisions of judges with the assessments of an algorithm? Some legal scholars and futurists have posited that artificially intelligent algorithms could form the basis of legal decisions. (1) This is not merely an issue for the future. Predictive algorithms are already used by lawyers to assist with dispute resolution. (2) And artificially intelligent tools are being used as a basis for legal decisions in China, (3) Estonia, (4) and other jurisdictions. (5) The question of how much authority the Canadian legal system should delegate to algorithms is, therefore, one of paramount importance in the 21st century.

The use of such algorithmic tools to help make decisions raises a number of potential concerns. Algorithmic bias is frequently high on the list, and examples of such bias are plentiful. In the legal context, studies have shown that racial bias infects algorithms used by judges to assess flight risk in bail decisions or the risk of recidivism in sentencing hearings. (6) These algorithms have been found to assess black defendants more harshly than white defendants, even when race is not one of the variables explicitly considered by the algorithm. (7) In other contexts, Amazon recently built an artificially intelligent tool to help with hiring employees, but shut it down after it was discovered to be discriminating against women. (8) The prevalence of such biases should caution us against using algorithms in decision making. This would appear to be especially true in the legal context, where decisions can come at the expense of life and liberty.

In this paper, I explore the potential to use one particular type of predictive algorithm in legal decision making. The type of algorithm examined here predicts the "most likely" outcome if a case were to go to court. An algorithm in this mould seeks to predict what would happen if a judge were to decide the case. The algorithm relies on data generated from previous judicial decisions. It presupposes that the prior decisions of judges provide a good basis for making predictions in future cases. If a judge were to use such an algorithm, she would essentially be endorsing the law as decided in previous cases.

Here, I investigate potential bias in these types of algorithms. I focus on one specific legal issue where such algorithms could conceivably be used by judges to assist with decision making in the near future. The legal issue I explore is this: what is a reasonable notice period to be awarded to an employee who has been dismissed? Under Canadian employment law, there is an implied term in an employment contract that upon dismissal without cause, an employee is entitled to a reasonable notice period, or a payment in lieu of such a notice period. What is reasonable will depend on the circumstances: the age of the employee, how long they have worked for the employer, the type of job the employee held, and what opportunities for similar employment exist, amongst other factors. This can be a difficult exercise for judges and arbitrators. How are they to weigh all these different factors to arrive at what is reasonable?

Judges and arbitrators frequently look to past cases for assistance in determining the length of a reasonable notice period. They look at prior decisions to see how previous judges and arbitrators have weighed the factors, and--for the most part--try to come up with a reasonable notice period that is in line with the prior law. This, too, is what the predictive algorithm does. The predictive algorithm uses the data describing relevant precedents and provides a best guess about how a new case would fit into the existing body of case law.
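To make the mechanics concrete, the following is a minimal sketch of such a predictive tool. It is illustrative only: it assumes the precedents have already been coded into a table of case characteristics with the awarded notice period as the outcome, and the file name, column names, and choice of model are hypothetical rather than drawn from any actual system.

```python
# Hypothetical sketch of a precedent-based prediction tool.
# Assumes past decisions are coded as rows of case characteristics;
# all file and column names are invented for illustration.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

cases = pd.read_csv("reasonable_notice_cases.csv")  # one row per past decision

features = ["age", "tenure_years", "is_management", "monthly_compensation"]
X, y = cases[features], cases["notice_months"]

# Fit the model to the existing body of case law.
model = GradientBoostingRegressor(random_state=0).fit(X, y)

# Best guess at how a new dismissal case would fit into that case law.
new_case = pd.DataFrame([{"age": 52, "tenure_years": 14,
                          "is_management": 0, "monthly_compensation": 5200}])
print(f"Predicted notice period: {model.predict(new_case)[0]:.1f} months")
```

Whatever the model, the logic is the same: the prediction can only be as good, and as fair, as the past decisions it is trained on.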

But what if the body of case law is riddled with bias? If judges were to use predictive algorithms that replicate the existing law, then any biases currently found in the case law will merely be reinforced. Thus, it is imperative to ensure that the data upon which the predictive algorithm is based--i.e., existing case law--not only reflects the objectives of the law, but also is free from harmful bias that may entrench discriminatory outcomes.

It is thus necessary to ask whether the existing case law does indeed contain biases. Here, I focus on gender bias, for two reasons. First, it is relatively easy to determine the gender of plaintiffs in past cases (at least, it is much easier than determining other characteristics, such as race). Second, prior studies provide evidence that legal decisions in reasonable notice cases contain elements of gender bias. Professor Kenneth Thornicroft, for example, has shown that female plaintiffs receive shorter reasonable notice awards than male plaintiffs. (9) If women receive lower damages than men for wrongful dismissal, holding all other variables constant, then one questions whether women and men are treated equally under the law. It follows that any algorithm using past decisions as the foundation for future decisions will merely perpetuate the gender bias. This, clearly, would be concerning and completely at odds with the objectives of the law.

Here, I re-examine the statistical evidence that reasonable notice awards in Canada reflect gender bias. I take advantage of the fact that there have been many more judicial decisions since Professor Thornicroft's studies. I also use data from all published decisions, whether from arbitrators or judges, over the period 1997 to 2019. My data describe 1,728 legal decisions, over ten times the size of Professor Thornicroft's dataset. (10) Further, my data are more refined, with more variables of interest that enable more detailed description and explanation of the content of the existing law. I perform simple statistical tests to determine whether the existing case law reflects a gender gap in reasonable notice awards. If there is a correlation between gender and the outcome, holding all other variables of relevance constant, it would suggest that there may be differences in the way that female and male employees are treated by the legal system.
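As an illustration of the kind of test involved, the sketch below regresses the notice award on an indicator for a female plaintiff while controlling for the other case characteristics. It is a simplified stand-in for the paper's analysis: the variable names are hypothetical, and the actual specification uses more controls.

```python
# Illustrative OLS test for a gender gap, conditional on other case factors.
# Variable names are invented; the paper's actual specification differs.
import pandas as pd
import statsmodels.formula.api as smf

cases = pd.read_csv("reasonable_notice_cases.csv")

model = smf.ols(
    "notice_months ~ female + age + tenure_years"
    " + is_management + monthly_compensation",
    data=cases,
).fit()

# A coefficient on `female` that is statistically indistinguishable from zero
# is consistent with no direct gender gap, holding the other factors constant.
print(model.summary().tables[1])
```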

In short, I find no direct evidence of a gender gap in the awards of reasonable notice. In these 1,728 cases, there is no statistically significant difference between the notice periods awarded to female plaintiffs and those awarded to male plaintiffs once all other relevant factors are held constant. This is not to dispute the findings of Professor Thornicroft's study. On the contrary, I am for the most part able to replicate Thornicroft's results in the subset of cases he examines and using his methodology. The broader point, though, is that when all available data are used over a longer period with more refined analysis, direct evidence of differential treatment vanishes.

While these results appear promising, they cannot be the end of the story. Gender differences in the law of reasonable notice can--and do--emerge through other channels, such as job type or compensation. (11) Both the type of job and the level of compensation are correlated with judicial outcomes in my dataset; they are also correlated with gender. For example, the data show that clerical workers receive shorter reasonable notice awards than other workers, such as those in management. In the dataset, clerical workers are disproportionately female and management disproportionately male. Thus, gender biases may be baked into the legal test. (12) Further, compensation is positively correlated with the outcome. In the dataset of cases, female plaintiffs receive lower compensation than their male counterparts. To the extent that compensation is correlated with both gender and the legal outcome, it might be that female plaintiffs receive shorter reasonable notice awards because they earn less. This has the effect of compounding any gender wage gap, since the final damages awarded to a plaintiff are the product of the reasonable notice award and the plaintiff's compensation. (13)
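The compounding effect is simple arithmetic, as the stylized example below shows. All figures are invented for illustration.

```python
# Stylized illustration of the compounding effect; all figures are invented.
# Final damages = reasonable notice (in months) x monthly compensation.
male_pay, female_pay = 6000, 5000        # hypothetical monthly wage gap
male_notice, female_notice = 12, 10      # hypothetical gap in notice awards

male_damages = male_notice * male_pay        # 72,000
female_damages = female_notice * female_pay  # 50,000

gap = 1 - female_damages / male_damages
print(f"Damages gap: {gap:.0%}")  # ~31%: the two gaps compound each other
```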

This paper has two parts. The paper explores one broad issue (should we use algorithms as legal decisions?) by focusing on a narrower one (is there statistical evidence of gender bias in Canadian employment law?). The answer to the narrower question helps inform our views on the broader question. I answer the narrower question first.

In Part 1, I re-examine the statistical evidence on bias against female plaintiffs in reasonable notice awards. I show that the data do not bear out any direct evidence of gender bias in the case law. I show that there are, however, other channels through which gender bias has persisted. In Part 2, I explore the potential for judges to use algorithmic predictions in the decision-making process, arguing that there may still be concerns about bias, depending on how the algorithm is implemented. In short, great care needs to be taken to ensure that the algorithm does not reinforce hidden biases in the law. I further explore the possibility of alternative types of algorithms that do not rely on judicial decisions as data. A final part concludes.

  1. Gender bias and reasonable notice awards

    In this Part of the paper, I examine the evidence as to whether the existing case law on reasonable notice awards reflects gender bias against female plaintiffs. First, I provide a short background on the legal issue. Next, I discuss the prior literature on gender differences in this area of law. I then describe the dataset of 1,728 cases from 1997 to 2019 and present my results. In simple linear regression tests, there is no direct evidence of gender bias. But I also explore other channels through which female plaintiffs may have been treated unfavourably.

    1.1 The legal background

    Under Canadian employment law, workers dismissed without cause are entitled to a reasonable notice period...
