February 15, 2022

In 2019, Google carried out a study of gender discrimination in their organization. Shockingly, they found that men were actually paid less than women and increased men’s pay as a result. The US Dept. of Labor looked at the same study and disagreed, finding that (as perhaps expected) men were paid more than women. Who was right and why?

To answer this question, we need to understand Simpson’s Paradox and use the power of causality to consider the study’s findings more carefully.

Google’s Analysis

To understand Google’s approach to this problem, let’s use the following (made up) data:

GENDER            FEMALE        MALE
NON-MANAGEMENT    $3163 (87)    $3015 (59)
MANAGEMENT        $5592 (13)    $5320 (41)

The number of people in each category is shown in parentheses.

You can see that within both the non-management and management categories, women are in fact paid more than men on average, so it seems Google may have been correct.

Let’s do the math:

Non-Management gender gap = $3163 – $3015 = $148 in favor of women
Management gender gap = $5592 – $5320 = $272 in favor of women

We also need to take into account the number of people in each category to find the average gender pay gap:

Non-management employees = 87 + 59 = 146
Management employees = 13 + 41 = 54
Total employees = 146 + 54 = 200

Adjusted gender pay gap = (146/200 × $148) + (54/200 × $272) = $181.48

So, by conditioning on job position in the same way as Google, we find that, on average, women are paid $181.48 more than men.
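
If you want to check the arithmetic, here’s a minimal Python sketch (using only the made-up numbers from the table above) that reproduces this position-conditioned calculation:

```python
# Made-up data from the table above: (average pay, headcount) per job category
female = {"non_mgmt": (3163, 87), "mgmt": (5592, 13)}
male   = {"non_mgmt": (3015, 59), "mgmt": (5320, 41)}

levels = ("non_mgmt", "mgmt")

# Pay gap within each job category (positive = women paid more)
gaps = {level: female[level][0] - male[level][0] for level in levels}

# Weight each within-category gap by that category's share of all employees
headcounts = {level: female[level][1] + male[level][1] for level in levels}
total = sum(headcounts.values())
adjusted_gap = sum(headcounts[level] / total * gaps[level] for level in levels)

print(gaps)                    # {'non_mgmt': 148, 'mgmt': 272}
print(round(adjusted_gap, 2))  # 181.48 -> women appear to be paid more
```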

But is that correct?

Simpson’s Paradox

Instead of conditioning on job position, let’s condition on gender. To do so, we calculate the overall average pay for women and for men, weighting each job category by its headcount:

Total women = 87 + 13 = 100
Total men = 59 + 41 = 100

Average pay for women = (87/100 × $3163) + (13/100 × $5592) = $3478.77
Average pay for men = (59/100 × $3015) + (41/100 × $5320) = $3960.05

Using this approach, we can see that men are paid $481.28 more than women on average, which is the opposite of Google’s findings!
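
The gender-conditioned averages can be checked the same way:

```python
# Same made-up table: (average pay, headcount) per job category
female = {"non_mgmt": (3163, 87), "mgmt": (5592, 13)}
male   = {"non_mgmt": (3015, 59), "mgmt": (5320, 41)}

def average_pay(group):
    """Headcount-weighted average pay across job categories for one gender."""
    total_people = sum(count for _, count in group.values())
    return sum(pay * count for pay, count in group.values()) / total_people

women_avg, men_avg = average_pay(female), average_pay(male)
print(round(women_avg, 2))            # 3478.77
print(round(men_avg, 2))              # 3960.05
print(round(men_avg - women_avg, 2))  # 481.28 -> men paid more on average
```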

This is an example of Simpson’s Paradox, whereby the result of a statistical analysis can flip 180 degrees simply by changing how we group the data before averaging.

For another example, let’s take a look at batting averages for Derek Jeter and David Justice in 1995 and 1996.

Jeter 1995 = 12/48 = .250
Jeter 1996 = 183/582 = .314
Jeter 95 & 96 = 195/630 = .310

Justice 1995 = 104/411 = .253
Justice 1996 = 45/140 = .321
Justice 95 & 96 = 149/551 = .270

We can see that Justice had the better batting average in both 1995 and 1996, but that Jeter had the better average when the two seasons are combined. Weird, right?

In this case, Simpson’s Paradox comes about because of the different number of at bats each player had in the different years.
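
A few lines of Python make the role of the unequal at-bat counts explicit:

```python
# (hits, at_bats) per season, as listed above
jeter   = {1995: (12, 48),   1996: (183, 582)}
justice = {1995: (104, 411), 1996: (45, 140)}

def combined_average(seasons):
    """Two-year average: total hits divided by total at-bats."""
    return sum(h for h, _ in seasons.values()) / sum(ab for _, ab in seasons.values())

for year in (1995, 1996):
    j_hits, j_ab = jeter[year]
    d_hits, d_ab = justice[year]
    print(f"{year}: Jeter {j_hits / j_ab:.3f}  Justice {d_hits / d_ab:.3f}")

print(f"95+96: Jeter {combined_average(jeter):.3f}  Justice {combined_average(justice):.3f}")
# Justice wins each individual season, but Jeter's 582 at-bats in his strong
# 1996 season dominate his combined average, flipping the overall comparison.
```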

Despite having been known for over 70 years (it was first described by Edward H. Simpson in 1951), Simpson’s Paradox crops up very often in everyday statistical analysis and is one of the major reasons we say “lies, damned lies, and statistics”, since it can lead to a lot of invalid conclusions.

Simpson’s Paradox and Causality

Let’s return to the Google example and see how a causal approach can help resolve Simpson’s Paradox.

If we draw a simple causal model of the Google problem before looking at the data, we can see that gender discrimination influences both salary difference AND promotion to management:

[Causal diagram: Gender → Promotion to Management; Gender → Salary]

Because gender causes both promotions and salary difference, we need to condition on gender when analyzing the data. It’s now obvious that Simpson’s Paradox led Google to come to the wrong conclusion.
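
One way to see why conditioning on job position is the wrong move here is to simulate data from this causal model. The sketch below uses a purely hypothetical data-generating process (the numbers are illustrative, not Google’s): gender influences both promotion and pay, and the position-stratified estimate captures only the direct part of the effect while missing the part that flows through promotions:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Hypothetical data-generating process (illustrative numbers, not Google's data):
# gender -> promotion and gender -> salary, matching the causal model above.
female = rng.random(n) < 0.5
promoted = rng.random(n) < np.where(female, 0.13, 0.41)  # women promoted far less often
salary = (3000.0
          + 2300.0 * promoted   # management pays more
          + 150.0 * female      # small direct pay advantage for women within a level
          + rng.normal(0.0, 100.0, n))

# "Google-style" estimate: compare within each job level, then weight the gaps
stratified = sum(
    (salary[(promoted == p) & female].mean()
     - salary[(promoted == p) & ~female].mean()) * (promoted == p).mean()
    for p in (False, True)
)

# Marginal estimate: condition on gender only (the total effect of gender on pay)
marginal = salary[female].mean() - salary[~female].mean()

print(f"stratified by position: {stratified:+.0f}")  # roughly +150: women look better off
print(f"marginal by gender:     {marginal:+.0f}")    # roughly -494: women earn less overall
```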

To illustrate this further, let’s now use exactly the same data, but with gender switched for lifestyle:

LIFESTYLE         HEALTHY       UNHEALTHY
NON-MANAGEMENT    $3163 (87)    $3015 (59)
MANAGEMENT        $5592 (13)    $5320 (41)

In this case, a reasonable causal model would show that being in management is likely to cause both a higher salary and a healthier lifestyle (more opportunities to play golf?):

[Causal diagram: Management → Salary; Management → Lifestyle]

In this case, we need to condition on management in order to do a valid analysis, which is the opposite of what we found in the Google example – despite the actual data being the same in both cases!
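
A matching simulation for the lifestyle version of the model (again with purely hypothetical, illustrative numbers) shows the opposite prescription: management causes both salary and lifestyle, lifestyle itself has no effect on pay, and it’s now the naive comparison by lifestyle that misleads while the management-conditioned comparison recovers the (null) truth:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Hypothetical data-generating process (illustrative numbers only):
# management -> salary and management -> lifestyle; lifestyle does NOT affect pay.
management = rng.random(n) < 0.27
healthy = rng.random(n) < np.where(management, 0.8, 0.5)  # managers live healthier
salary = 3000.0 + 2300.0 * management + rng.normal(0.0, 100.0, n)

# Naive comparison: average salary of healthy vs unhealthy employees
marginal = salary[healthy].mean() - salary[~healthy].mean()

# Comparison within each job position, weighted by the size of each position
stratified = sum(
    (salary[(management == m) & healthy].mean()
     - salary[(management == m) & ~healthy].mean()) * (management == m).mean()
    for m in (False, True)
)

print(f"marginal by lifestyle:    {marginal:+.0f}")   # roughly +560, entirely spurious
print(f"stratified by management: {stratified:+.0f}") # roughly 0, the true (null) effect
```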

Summary

We can see that causal analysis helps us to resolve common problems such as Simpson’s Paradox. We can also see that it requires us to think about the problem before looking at the data!

This need to actually understand the problem before diving into data science is at the heart of Geminos’ approach to building AI-driven solutions. It simply isn’t good enough to run algorithms on whatever data we can find and look for patterns – we need to understand the underlying causes and the effects they trigger. Only then can we start looking for the data and algorithms that help us solve the problem at hand.

Acknowledgements

The Google example is taken from Paul Hünermund’s excellent YouTube course on causality.
The baseball batting averages are taken from the Wikipedia article on Simpson’s Paradox.
Prof. Judea Pearl’s “The Book of Why” also discusses Simpson’s Paradox in detail.