Hitting the Books: AI could help shrink America’s gender wage gap

Women have faced gender discrimination in the workforce throughout history: refused employment in all but a handful of menial roles, routinely passed over for promotions and pay rises, and rarely compensated at the same rates as their male peers. This long and storied socio-economic tradition of financially screwing over half the population continues largely unabated into the 21st century, where women still make 84 cents for every dollar that men make. In her new book, The Equality Machine: Harnessing Digital Technology for a Brighter, More Inclusive Future, Dr. Orly Lobel, Professor of Law and a founder of the Center for Intellectual Property Law and Markets at the University of San Diego, explores how digital technologies, often maligned for their role in exacerbating social ills, can be harnessed to undo the damage they have caused.

The cover of The Equality Machine (PublicAffairs)

Excerpted from The Equality Machine: Harnessing Digital Technology for a Brighter, More Inclusive Future by Orly Lobel. Copyright © 2022. Available from PublicAffairs, an imprint of Perseus Books, LLC, a subsidiary of Hachette Book Group, Inc.


For years, the double standard was evident: employers demanded confidentiality about what they paid their own workers while asking potential hires about their salary history. Now we can address both ends of this asymmetry. Just as digitization is helping to reverse the flow of information and foster greater market transparency about the value of workers, new laws are directing employers to rely less on past wage levels, which can be tainted by systemic inequality. In 2016, Massachusetts became the first state to pass a law prohibiting employers from asking job applicants about their salary history. Since then, more than a dozen states have followed suit.

There are two aims to prohibiting employers from asking potential job applicants about their salary history. The first is to break the vicious cycle of the pay gap, which emerges when a woman is paid less in a previous job and that gap is then repeated by the next employer. The second is to address gender differences in the negotiation process. Salary figures are plagued by gender differences and can perpetuate and exacerbate existing disparities in the market. When a woman reveals that she currently earns less than a man, it can damage her salary trajectory, both in the job she is applying for and for the rest of her career. Every time she discloses her current salary to a potential employer, that gap is likely to grow, as recruitment offers and promotions are often framed as a percentage increase over current base salary. Rather than relying on biased figures, bans on salary-history inquiries lead employers to use other ways of determining a potential employee's worth, including switching to automated calculation. Employers using market data and internal data may consider merit-related characteristics in determining pay, such as experience, training, education, skill, and past performance.
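The compounding described above is easy to see with a toy calculation. The figures below are entirely hypothetical (an invented 10 percent starting gap and a flat 4 percent annual raise), chosen only to show how percentage-based raises widen an initial disparity over a career:

```python
# Toy illustration: how a salary-history anchor compounds an initial pay gap.
# All figures are hypothetical and chosen only for illustration.

def career_salary(start, annual_raise, years):
    """Final salary after `years` of percentage-based raises."""
    salary = start
    for _ in range(years):
        salary *= 1 + annual_raise
    return salary

# Two equally qualified candidates; one is anchored 10% lower at the start.
his = career_salary(100_000, 0.04, 20)
hers = career_salary(90_000, 0.04, 20)

print(f"His final salary: ${his:,.0f}")
print(f"Her final salary: ${hers:,.0f}")
print(f"The absolute gap grew from $10,000 to ${his - hers:,.0f}")
```

Because each raise multiplies the current base, the dollar gap more than doubles over twenty years even though both workers receive identical percentage increases.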

Yet, as we’ve seen, human bias can creep into our algorithms, and an algorithm fed data tainted by pay bias is likely to perpetuate that bias itself. Feedback loops are digital vicious circles that can lead to self-fulfilling consequences. Once again: bias in, bias out. The risk is that an algorithm will learn that certain types or categories of workers are underpaid on average, and then calculate that into salary offers. This is the wrong that recent policy is designed to eliminate – and that we can program AI to avoid. The removal of the anchor numerical figure encourages employers to proactively assess salary based on the needs of the company and the candidate’s fit rather than on a tainted number. At the same time, having information about a pay scale for a job but not having a salary history on the table can encourage women to ask for more.
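A minimal sketch can make "bias in, bias out" concrete. The records and group labels below are invented; the point is only that a naive pay model trained on historically biased offers reproduces the underpayment, while pricing the role on merit-related features alone does not:

```python
# Minimal sketch of "bias in, bias out": a naive pay model trained on
# historical offers inherits the underpayment baked into its data.
# All records below are invented for illustration.

from statistics import mean

history = [
    {"group": "A", "experience": 5, "salary": 100_000},
    {"group": "A", "experience": 7, "salary": 110_000},
    {"group": "B", "experience": 5, "salary": 84_000},  # same experience, lower pay
    {"group": "B", "experience": 7, "salary": 92_000},
]

def naive_offer(group):
    """Predict an offer as the historical mean for that group: bias in, bias out."""
    return mean(r["salary"] for r in history if r["group"] == group)

def merit_offer(experience):
    """Debiased alternative: price the role on experience only, pooled across groups."""
    return mean(r["salary"] for r in history if r["experience"] == experience)

print(naive_offer("A"), naive_offer("B"))  # the group-based model reproduces the gap
print(merit_offer(5))                      # the pooled, merit-based figure does not
```

The same feedback loop appears in real compensation models whenever a protected attribute, or a close proxy for it, is left in the training data; removing the tainted anchor is what lets the merit-based calculation break the cycle.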

What’s more, AI can also help in the future—perhaps not even the distant future—by replacing some of the negotiation that takes place in unequal settings. Empirical studies on negotiation differences between men and women have repeatedly shown that women on average negotiate less, and that when they do, employers react negatively. Women do not ask for higher wages, better terms, promotions, or opportunities nearly as often as men do. In my research, I have called this the lack of negotiation. In one study at Carnegie Mellon University, 93 percent of female MBA students accepted a starting salary offer, while only 43 percent of men did. In another study, female participants simulating salary negotiations asked for an average of $7,000 less than male participants. Economists Andreas Leibbrandt and John List have also found that although women are much less likely to negotiate salary with employers, this difference disappears when all job seekers are specifically told that salary is negotiable, mitigating the pay gap. My own experimental research with behavioral psychologist and law professor Yuval Feldman, my longtime colleague, has found that in some work environments women behave less like homo economicus—that is, like rational economic actors—and more like altruistic social actors, so that women do not demand as much for themselves as men and are more likely to value non-financial benefits, such as a good corporate culture.

Can these research insights offer us clues for developing new software tools that will encourage women to negotiate? Digital platforms can serve employees by providing advice and information on asking for a raise or preparing for an interview. Information about salary—and especially a specific expectation that pay can and should be negotiated—can empower candidates to negotiate higher salaries before accepting job offers. The digital platform PayScale conducts annual surveys asking thousands of job seekers whether they disclosed their pay at previous jobs during the interview process. A 2018 PayScale survey found that women who were asked about their salary history and declined to disclose were offered 1.8 percent less than women who were asked and did disclose. In contrast, men who declined to disclose when asked about salary history received offers 1.2 percent higher than men who did disclose.

Even when women do negotiate, they are treated differently. In my research, I call this phenomenon the negotiation penalty. Women are told to “lean in” and make demands, but the reality is that for centuries women have generally been viewed as weaker negotiators than their male counterparts. In one series of experiments, participants evaluated written accounts of applicants who did or did not initiate negotiations for higher pay. The results of each experiment showed that participants penalized female candidates more than male candidates for initiating negotiations, judging women who asked for more as not “nice” or too “demanding.” While qualities such as assertiveness, strength, and competitiveness are culturally beneficial to male negotiators, women who exhibit such traits are often viewed as overly aggressive. Another study looked at data from a group of Swedish job seekers and found not only that women ended up with lower salaries than equally qualified male peers, but that they were often penalized for negotiating as those peers did. Nick Yee and Jeremy Bailenson have shown that attractive avatars lead to more intimate behavior with confederates in terms of self-disclosure and interpersonal distance. In a second study, they also found that tall avatars led to more confident behavior than short avatars in a negotiation task. They call this the Proteus Effect (the Greek god Proteus was known for his ability to take on many different forms). The Proteus Effect suggests that an avatar's visual characteristics are linked to stereotypes and behavioral expectations, including those that affect the way we negotiate.

The eleventh annual competition for artificial intelligence trained to negotiate—the Hagglebot Olympics, as it’s called in the popular media—was held in January 2021. Universities from Turkey and Japan won this time. In some experiments involving conversations with bots, most people didn’t even realize they were talking to a bot rather than another person—the bots had learned to hold fluent conversations that convincingly mimicked humans. Using game theory, researchers are steadily improving the ways in which bots can negotiate on behalf of humans, removing some of the respects in which we humans are fallible, such as struggling to consider and weigh many different aspects of a deal. AI can now predict the other side’s choices quite quickly. For example, an AI listening to the first five minutes of a negotiation through a microphone learns to predict much of the eventual deal from the negotiators’ voices alone. Tracking these speech patterns through machine learning, it appears that when a negotiator’s voice varies greatly in volume and pitch, they are a weak player at the negotiating table, and that when the negotiating parties mirror each other, they are closer to reaching an agreement. The use of AI has also helped to reveal the ways in which women are punished at the negotiating table. A new study from the University of Southern California used a chatbot that did not know the participants’ gender to evaluate negotiation skills. The study showed that most of us—men and women—do pretty poorly at negotiating salaries. Over 40 percent of participants did not negotiate at all, and most people left money on the table that they could have received. Women valued stock options less than men did as part of their compensation package, affecting the likelihood that women would accumulate wealth over time. These developments can also help bridge negotiation differences across different identities.
A group of Israeli and American researchers looked at how an AI agent can negotiate with humans from different cultural backgrounds. Without telling the machine anything about the characteristics of people from three countries—Israel, Lebanon, and the United States—they let the AI learn patterns of cultural negotiation differences by participating in negotiation games. They found that the computer could outperform people in every country. These developments are promising. We can envision bots learning about negotiation differences and eventually counteracting such differences to create fairer exchanges, ensure fairness, and achieve fair outcomes. They can be designed to advance the specific goals we set for them.

