April 22, 2021

JT NEWS


Not OK, computer: music streaming diversity problem



Sexism can be a subtle problem. In the music industry, for example, we’ve not only had #MeToo scandals exposing abuse by male singers, musicians and producers, but we’ve also seen less obvious ways in which women seem to be at a disadvantage.

Take people’s listening patterns on streaming services. Look at the Top 10 most-streamed artists on Spotify in 2020, for example: only two are women, and the higher-placed of them, Billie Eilish, sits at number seven. It may not sound like a case of discrimination, but how we got here raises important questions.

Now a team of European computer scientists has explored this trend by examining the algorithms of streaming services. More specifically, Christine Bauer from the University of Utrecht in the Netherlands and Xavier Serra and Andres Ferraro from the Universitat Pompeu Fabra in Spain analyzed the publicly available listening records of 330,000 users of one service. The data showed that female artists represent only 25% of the music those users listened to. As the authors wrote on The Conversation platform: “On average, the first recommended track was by a man, along with the next six. Users had to wait until the seventh or eighth song to hear one by a woman.”

People learn about their musical tastes in all kinds of ways, but the way most of us listen to music now poses specific issues of ingrained bias. When a streaming service offers music recommendations, it does so by studying the music you’ve already listened to. If the service already delivers more music by men, this creates a vicious feedback loop, with consequences most of us listeners are unaware of.
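The feedback loop can be made concrete with a toy simulation. This is my own illustrative construction, not the researchers’ model: a naive recommender that always pushes whatever group is already most played will amplify an initial imbalance round after round, since every recommendation generates a new play.

```python
# Toy feedback-loop illustration (an assumption-laden sketch, not the study's code).
# Start from the 25% female listening share the researchers observed.
plays = {"male_artists": 75, "female_artists": 25}

for _ in range(100):
    # A naive recommender: always suggest the already-dominant group...
    recommended = max(plays, key=plays.get)
    # ...and each recommendation turns into another play, reinforcing the bias.
    plays[recommended] += 1

share = plays["female_artists"] / sum(plays.values())
print(round(share, 3))  # the female share shrinks from 0.25 toward zero
```

After only 100 simulated rounds, the female share has already fallen from 25% to 12.5%, which is the "vicious loop" in miniature: the imbalance is not anyone's deliberate choice, it is an emergent property of recommending from past behaviour.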

Is there a solution? The researchers came up with one: they simulated the algorithm and tweaked it several times to raise the ranking of female artists (i.e. give them more exposure by recommending them earlier) and lower that of men. When they let this system run, a new feedback loop emerged: the AI did indeed recommend female artists earlier, which made listeners more aware of them; and when the platform learned that their music was being chosen, it recommended it more often.
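The tweak the researchers describe can be sketched as a simple re-ranking pass. The details below are my assumptions for illustration, not the study’s published code: each recommendation carries an artist-gender label, and tracks by women are moved up the list by a fixed number of positions (`lam`), implemented here with a stable sort on shifted positions.

```python
def rerank(recommendations, lam):
    """Re-rank a best-first recommendation list so that tracks by
    female artists move up to `lam` positions earlier.

    `recommendations` is a list of (track, artist_gender) pairs.
    """
    def shifted_position(item):
        index, (_track, gender) = item
        # Boost: a track by a woman competes as if it sat `lam` slots higher.
        return index - lam if gender == "female" else index

    indexed = list(enumerate(recommendations))
    # Python's sort is stable, so ties keep their original relative order.
    indexed.sort(key=shifted_position)
    return [pair for _index, pair in indexed]

playlist = [
    ("track_a", "male"), ("track_b", "male"), ("track_c", "male"),
    ("track_d", "female"), ("track_e", "male"),
]
print(rerank(playlist, lam=3))
```

With `lam=3` the one female-artist track jumps from fourth to second place, which mirrors the effect the researchers report: listeners encounter a woman’s track within the first couple of recommendations instead of around the seventh.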

Bauer tells me it was “a positive surprise” to see the streaming service’s apparent bias shift so much with a few algorithm tweaks. “Of course, it’s always easier to fix something in theory rather than in practice,” she says, “but if that effect were similar in the real world, that would be great.” She adds that the group is currently exploring how to use the same approach to tackle ethnic and other forms of discrimination on media platforms.

The team points out that this work is still at an early stage, but the study is thought-provoking for two reasons. First, and most obviously, it shows why it pays to have a broader debate about how now-ubiquitous AI programs work and, most importantly, whether we want them to extrapolate from our collective past into our future. “We are at a critical juncture, one that requires us to ask tough questions about how AI is produced and adopted,” writes Kate Crawford, who co-founded an AI research center at New York University, in a powerful new book, Atlas of AI.

Second, music streaming should also make us think about the thorny issue of affirmative action. Personally, I have often been wary of this concept, as I have built my career trying to avoid defining myself by gender. But today, after years of working in the media, I also realize the power of the “demonstration effect”: if a society only sees white men in positions of power (or on the pages of newspapers), it creates a cultural feedback loop, not unlike these streaming services.

It affects many aspects of business. Think about venture capital: research from a multitude of groups shows that diverse teams outperform homogeneous teams. Yet, according to Deloitte, 77% of venture capitalists are men and 72% white, while black and Latino investors received just 2.4% of funding between 2015 and 2020, according to Crunchbase.

This pattern did not emerge primarily because powerful people are overtly sexist or racist; the more subtle problem is that financiers prefer to work with colleagues who fit their “culture” well (that is, who look like them) and support entrepreneurs who have a proven track record – except that most of these entrepreneurs look like them.

“Traditional investors generally view funds led by people of color and women as higher risk, despite widely available evidence that diversity actually mitigates risk,” note financiers Tracy Gray and Emilie Cortes in the Stanford Social Innovation Review. You could address this with something akin to the music-algorithm rework: foundations could deliberately elevate diverse employees and over-invest in funds managed by diverse groups to change the feedback loop.

Would that work? No one knows yet because it has never been done on a large scale, or at least not yet in finance. The reality is that it’s probably even harder to change human biases than it is to change an algorithm.

But if you want a reason to feel upbeat, consider this: while computer programs can entrench existing biases, the astonishing levels of transparency that big data can provide are also able to illuminate the problem with clarity. This, in turn, can galvanize action, if we choose it – in music and elsewhere.

Follow Gillian on Twitter @gilliantett and send her an email at gillian.tett@ft.com

Follow @FTMag on Twitter to find out about our latest stories first




