all 7 comments

[–]LtGreenCo 7 insightful - 1 fun -  (1 child)

'By claiming that racism, sexism and other forms of discrimination can be stripped away from the hiring process using artificial intelligence, these companies reduce race and gender down to insignificant data points, rather than systems of power that shape how we move through the world.'

And there's the real crux of the matter. The problem isn't the "weird" way the AI categorizes people. The problem is that it's not giving special preference to minorities.

[–]Dragonerne 6 insightful - 1 fun -  (0 children)

You have to realize that the AI is already incredibly biased against white men, because they do all kinds of things to favor minorities. The AI just isn't biased enough against white men, but that is a fault of the developers, because we do have the methods to deal with these issues.

It has been the MOST researched area in AI for the past decade. Imagine how insane that is. We have one of the most amazing technologies but the MOST researched field in AI is how to discriminate against white men. Think about how insane that is.

White people often laugh about how the AIs are "racist, sexist, etc." but you only hear about the ones that the wokes dislike; you don't hear about the 1000s of AIs that are working perfectly well to keep white men down. And the ones the wokes dislike are not favoring white men, they're simply not biased enough.

[–]ClassroomPast6178[S] 3 insightful - 1 fun -  (2 children)

While most job interviews were once face-to-face affairs, during the Covid-19 pandemic there was a surge in the number of interviews taking place online.

Amid this rise, many companies started using AI tools to sift through candidates before they were interviewed by a human.

These tools are marketed as unbiased against gender and ethnicity, with developers claiming they can help to improve diversity in the workplace.

However, a new study has warned that using AI in hiring is little better than 'automated pseudoscience'.

Researchers from the University of Cambridge found that, during video interviews, AI tools tend to favour people sitting in front of bookshelves, people wearing headscarves, and those without glasses.

So basically, bespectacled, poor, white men, you’re fucked, AI says “No!”

[–]Datachost 3 insightful - 3 fun -  (1 child)

and those without glasses.

Khmer Rouge, but make it woke

[–][deleted] 2 insightful - 1 fun -  (0 children)

You don't think the Khmer Rouge was woke? It's the same model of Communism that the woke strive for now.

[–][deleted] 3 insightful - 1 fun -  (0 children)

Headscarf, you say? Well, less "say," and more "gloss over," specifically because you know that addressing it shows a clear preference for women of one particular religion, and more often than not of certain minority races as well, and your article wants to call AI biases weird, rather than based on race or sex.

Yeah, sure is weird and not at all based on race, ethnicity, religion or sex.

[–]Q-Continuum-kin 2 insightful - 1 fun -  (0 children)

One amusing thing I've found is that where I work you can usually tell whether a person does HR or managerial work vs. hands-on laboratory work based on how they use MS Teams during an all-employee meeting. Basically all the HR and managerial people keep their cameras on at all times for literally no reason other than to chew up bandwidth. Everyone else keeps their cameras off at all times, including while presenting, and most of us don't even have a profile picture.

We also have a monitor with a webcam that slides in and out of the panel, which I never see extended on anyone's screen. Today I came in and noticed that someone had gone around and slid out every single webcam in the office, and I immediately slid the camera back in upon sitting down.

The thing about AI is that it needs to be trained on some user data. The HR and managerial people are obsessed with the image of things, so they probably flagged a bunch of interviews with people who spent more time setting up a scene for the webcam than learning how to do the job at hand. This includes having a fancy bookcase, nice lighting, etc.