- cross-posted to:
- technology@lemmy.world
an AI resume screener had been trained on CVs of employees already at the firm, giving people extra marks if they listed “baseball” or “basketball” – hobbies that were linked to more successful staff, often men. Those who mentioned “softball” – typically women – were downgraded.
Marginalised groups often “fall through the cracks, because they have different hobbies, they went to different schools”.
*video being part of an interview when being on video is a key part of the job. I’m still not seeing your logic: why would anyone adapt a hiring process to accommodate people who literally can’t do the job? No amount of empathy on my part magically makes these people able to handle the necessary “communicating in person and on camera” portion of the job.
Would you expect someone hiring taxi drivers to design an application process that makes sure people who can’t drive are included? Or a coffee shop making it clear that people with severe allergies to coffee can apply and work there and… be chronically sick at work or worse?
I think you’re stretching things to absurdity, and I can no longer tell whether I’m being manipulated by a bit or you’re just refusing to read what I’m writing. Even LLM chatbots tend not to be this stubbornly black and white.
Are you a bot? Legally, you have to tell me the truth if I ask, just like an undercover cop has to. /s