LinkedIn’s Algorithm Under Scrutiny for Gender Bias

By Lisa Wong

LinkedIn’s algorithm is under fire once again after being accused of bias against women, sparking larger conversations about equity in online spaces. Recent experiments suggest that changing a profile’s gender can significantly influence visibility, with some users seeing dramatic increases in engagement when presenting as male.

So Michelle changed her account’s profile from female to male. Within a single day, her impressions grew by 238%. That stark contrast raises the question of what mechanisms or biases are built into LinkedIn’s algorithm.

As the professional networking platform continues to grow, LinkedIn has implemented Large Language Models (LLMs) to personalize content for its users. These models can replicate human biases, including sexism and racism, which raises justifiable questions about their impact on user experience. In light of researchers’ findings of such biases in popular LLMs, governments, advocacy organizations, and academics have called for the platform to be held more accountable.

Experiment Highlights Gender Disparities

Cindy Gallop and Jane Evans ran the #WearthePants experiment to test how the algorithm responds to gender. The findings were striking: Gallop’s post reached only 801 people, while a male counterpart sharing the same content reached 10,408, roughly a thirteenfold difference for identical content.

These findings highlight the stark disadvantage women face when participating on LinkedIn. As Chad Johnson noted, “It cares whether your writing shows understanding, clarity, and value.” Yet factors like a user’s perceived gender seem to subvert that evaluation, leading to decreased visibility for women.

Furthermore, the algorithm optimizes for user engagement metrics, including clicks, saves, and other interactions with content. Because member behavior changes continuously, the algorithm adjusts in near real time as users respond to content, creating a feedback loop: visibility isn’t just a matter of good content; it also reflects the gendered perceptions lurking in the community. A post that starts with less engagement earns less visibility, which in turn earns less engagement.
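
To make that dynamic concrete, here is a minimal toy simulation, assuming a simple multiplicative update rule. The weights and the 5% engagement penalty are invented for illustration; this is in no way LinkedIn’s actual ranking system. It shows how a small, persistent engagement gap can compound into a large visibility gap.

```python
# Illustrative only: a toy model (NOT LinkedIn's actual system) of how an
# engagement-optimized ranker can amplify a small initial disparity over time.
import random

random.seed(42)

def simulate(engagement_bias: float, rounds: int = 30) -> float:
    """Return final visibility after `rounds` of ranking updates."""
    visibility = 1.0
    for _ in range(rounds):
        # Engagement scales with current visibility (more reach -> more clicks),
        # shifted by a hypothetical audience-bias term.
        engagement = visibility * (1.0 + engagement_bias) * random.uniform(0.9, 1.1)
        # The ranker blends prior visibility with the latest engagement signal.
        visibility = 0.5 * visibility + 0.5 * engagement
    return visibility

print(round(simulate(0.0), 2))    # baseline author: visibility stays near 1.0
print(round(simulate(-0.05), 2))  # a 5% engagement penalty compounds to roughly half
```

Even though the ranker itself never sees a demographic signal, the penalized author ends up with about half the baseline visibility after thirty rounds. That compounding is the feedback loop critics describe.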

User Experiences Reflect Algorithmic Bias

Many other users have shared stories like Michelle’s after switching their profile gender to male. Megan Cornish and Rosie Taylor both gained a broader platform after making the change. This phenomenon raises important ethical questions about how LinkedIn’s algorithm judges content according to assumed gender.

Shailvi Wakhulu commented on the adverse effects of these biases on content creators: “It’s demotivating for content creators with a large loyal following.” Her concern echoes broader worries that algorithmic bias can undermine engagement for historically excluded communities.

Brandeis Marshall emphasized the implications of demographic influences: “If Black women only get interactions when they talk about Black women but not when they talk about their particular expertise, then that’s a bias.” These observations further point to the need for LinkedIn to reexamine its algorithm and take steps to treat all demographics more fairly.

Calls for Accountability and Transparency

Fortunately, awareness of these issues is increasing among industry professionals and consumers alike, many of whom are now calling for LinkedIn to be held accountable for any biases in its algorithm. Marilynn Joyner expressed her frustration: “I’d really love to see LinkedIn take accountability for any bias that may exist within its algorithm.”

LinkedIn’s Head of Responsible AI and Governance, Sakshi Jain, has reiterated that demographic information is not used as a signal for visibility. Chad Johnson, however, pointed out that recent changes, such as deprioritizing likes and comments, have complicated matters: reweighting engagement signals makes engagement harder to measure, and the effects may not fall equitably across user demographics. A sketch of how such reweighting can flip rankings follows below.
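
As a minimal sketch, assuming a hypothetical linear scoring function (the signals, weights, and dwell-time field below are invented for illustration, not LinkedIn’s published formula), shifting weight away from likes and comments toward another signal reorders which posts surface:

```python
# Illustrative sketch, NOT LinkedIn's actual scoring formula: how deprioritizing
# likes and comments changes which posts rank highest.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    likes: int
    comments: int
    dwell_seconds: float  # hypothetical "meaningful engagement" signal

def score(post: Post, like_w: float, comment_w: float, dwell_w: float) -> float:
    # Weighted sum of engagement signals.
    return like_w * post.likes + comment_w * post.comments + dwell_w * post.dwell_seconds

posts = [Post("A", likes=500, comments=40, dwell_seconds=30),
         Post("B", likes=80, comments=10, dwell_seconds=240)]

# Old weighting: reactions dominate the score.
old = sorted(posts, key=lambda p: score(p, 1.0, 2.0, 0.1), reverse=True)
# New weighting: likes and comments deprioritized in favor of dwell time.
new = sorted(posts, key=lambda p: score(p, 0.2, 0.5, 1.0), reverse=True)

print([p.author for p in old])  # ['A', 'B']
print([p.author for p in new])  # ['B', 'A']
```

The point is not the specific numbers but the design choice: any reweighting changes whose content wins, and without transparency into the weights, users cannot tell whether the change lands equitably.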

The algorithm’s complexities extend well beyond gender; many factors determine which content gets priority and which gets buried. Brandeis Marshall noted that “what we don’t know of is all the other levers that make this algorithm prioritize one person’s content over another.” Her observation highlights the lack of transparency around how content is curated and served on the platform.