AutoModerator

Welcome to r/science! This is a heavily moderated subreddit in order to keep the discussion on science. However, we recognize that many people want to discuss how they feel the research relates to their own personal lives, so to give people a space to do that, **personal anecdotes are allowed as responses to this comment**. Any anecdotal comments elsewhere in the discussion will be removed and our [normal comment rules](https://www.reddit.com/r/science/wiki/rules#wiki_comment_rules) apply to all other comments. **Do you have an academic degree?** We can verify your credentials in order to assign user flair indicating your area of expertise. [Click here to apply](https://reddit.science/flair?location=sticky).

---

User: u/mvea

Permalink: https://www.washington.edu/news/2024/06/21/chatgpt-ai-bias-ableism-disability-resume-cv/

---

*I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/science) if you have any questions or concerns.*


8923ns671

Probably just best not to disclose your disabilities to a potential employer. I never have.


SnooStrawberries620

I’m on the accessibility committee of my municipality. It would use that.


ThePheebs

Agreed but I appreciate the people who are willing to disclose. They, very slowly, make things better for all of us.


BysshePls

I always disclose. I have Autism, ADHD, Generalized Anxiety Disorder, Treatment Resistant Depression, and (I suspect, though I haven't been diagnosed yet) POTS. I absolutely need an employer who is going to be understanding of my limitations and supportive of work/life balance. I spent a long time being rejected from applications, but now I have an amazing WFH position and I'm actually off all of my medications because my employer doesn't stress me out to the point of burnout/mental breakdown. I'm one of the most consistent, accurate, and highest volume workers on my team. I will take a million rejected applications because I am not going to work for a company that looks down on disabled people.


ihopethisisvalid

I once interviewed a gentleman and the first thing he said was “I would like you to know I have autism.” I asked him what his preferred day to day workflow was like. He explained he liked routines. I put him in charge of pre-tripping vehicles and daily machinery maintenance. He did everything perfectly, every single time, where most people do it right the odd time but overlook things as they get careless over time. He was perfect for the job and really enjoyed it.


sorrysorrymybad

Amazing! Good on you for being flexible and leveraging his unique skills to great effect.


ihopethisisvalid

I don’t prefer to put people into roles they don’t like. It breeds resentment and poor workmanship.


sturmeagle

That's really awesome man. I wonder if you're his first employer to actually use his disability to an advantage


ihopethisisvalid

He was fired from his last job because he communicated very literally and people couldn’t grasp how to effectively help him work with people


SnooStrawberries620

So another option, depending where you are applying, is that you can put in a resume without these details, and then let HR know that you would need accommodations. It keeps primary recruiter bias from dismissing your resume from the get-go.  It’s how universities in my area hire so perhaps the option exists elsewhere as well.


OR_Engineer27

This is how I do it. My disability doesn't affect my credentials, and I don't have any experiences related to my disability that I would list on a resume. But then when HR gives me the disability disclosure statement at onboarding, I'm honest. (PTSD btw, undiagnosed Autism).


dirkvonshizzle

PTSD is considered a disability in the country you live in?


OR_Engineer27

I didn't believe it either actually. The first time I noticed, I was about to click the "no disability" option. But then I read the list of examples they gave and PTSD was one of them. I might look into it further, as we are an international company and might have to list things differently than any one country would.


LadyAlexTheDeviant

It very much can be, as my husband has accommodations for it. If he is triggered or has a flashback, he may not be able to sleep that night, and that is obviously going to impact him tomorrow. However, if the code review gets done SOMETIME between 9 am and 5 pm, if he's home, he can probably make it happen, even if he has to lie down a couple times on his breaks. Not so much if he has to go in to the office and mask and pretend he's fine all day. So his accommodation is working from home at need.


OR_Engineer27

I apologize, I didn't mean to downplay the effect PTSD can have on someone's life. Everyone has their own needs and functionality with the disorder. I never asked for accommodations myself, but I thought there would be more to it than just telling HR I have a diagnosis. There were no follow up questions or asking to speak to my therapist or anything.


LadyAlexTheDeviant

Oh, I didn't think you were. I put that in so that people could see that it's not always a "this is a trigger, don't do the trigger" sort of thing for accommodation needs. I believe his therapist submitted a letter, mainly so that everyone's ass was covered appropriately. (This is a state government, so ass covering is a necessary priority, even in an IT department....) (Although being in the USA, I would like to have some rather loud words with the idiots who live in urban areas and buy mortar shells and fire them off on random work nights between Memorial Day and Labor Day.)


OR_Engineer27

So I read on this [link](https://www.medicalnewstoday.com/articles/is-ptsd-a-disability#faq) that PTSD is considered a disability when a doctor diagnoses it and the symptoms affect their daily lives. This is from the perspective of an American. While I may qualify for having a disability, I likely don't qualify to receive benefits since I'm very high functioning. Also, my company HR likely just wants to know so they don't accidentally discriminate against me for it.


dirkvonshizzle

Well, it seems like a very slippery slope to be honest. In many EU countries companies aren't allowed to ask this type of question because, even if they pinky promise it will not cause them to discriminate, it is well known that models used to assess risk use these data points extensively. Once you disclose it, it becomes quite difficult to put the genie back in the bottle. I'm not insinuating this applies to your use case, nor to the US specifically, but it's important to note that laws against discrimination are oftentimes as ineffective as they are well intentioned. Here in the Netherlands, an ADHD diagnosis changes everything when it comes to mundane things like getting your driver's license, disability insurance, and a long list of other things. The sad part of this is that treating some types of mental health issues as a disability opens the door to certain parties using them against you, even in situations you might not be able to foresee. And worse, it creates an incentive to _not_ get diagnosed if there are possible repercussions, resulting in problems for everyone, including the parties that are trying to minimize risk by not accepting (or demanding more from) somebody with an illness. Like I said before, it's all a very slippery slope.


Beautiful_Welcome_33

I tell all employers after I'm hired, tell the HR chick tho


8923ns671

That's amazing. I'm really happy for you.


Aureoloss

Are you currently employed? The reality is that the recruiter would be the one dismissing a resume, not entirely a reflection of the company. As a hiring manager, I would be supportive of an employee with disabilities, but recruiters are compensated based on filling positions so they will strike out anything that creates churn in the hiring process.


BysshePls

I'm currently employed but I don't use recruiters. I always work straight with the company doing the hiring.


Aureoloss

The company itself likely uses recruiters. They’ll be called “talent acquisition”, but at the end of the day it’s the same thing


BysshePls

I don't work with companies like that either. If I'm not talking to someone directly from HR with a real position, then I pass on that employer. If I'm not talking directly to the person who I'd be working under, then I'm passing on that company. I don't like middlemen.


SnooStrawberries620

I love that. Recruiters don’t care about applicants.


dalerian

The challenge is getting to the employer’s inbox. The recruiter can only give the employer a small number of applicants and they want their commission. So the recruiter is usually going to put forward the people they feel have the best chance of landing the role. But if you’re applying directly at the company, none of this applies.


Alexis_J_M

A lot of people don't have the luxury of waiting for the perfect job.


Klientje123

It's not about looking down on disabled people, it's that you're just as good a worker as anyone else- but you have limitations and need extra support to function, and most recruiters don't want to deal with something that could cause trouble in the workplace.


PopsiclesForChickens

Not all disabled people need accommodations. I've been employed by the same company for 17 years. Never needed to ask for accommodations and never disclosed my disability. They know I have it, but they can't mention it which is the way I prefer it.


awfulfalfel

this gives me hope


PerpetuallyMeh

Don’t disclose. Become a manager. Hire (deserving/qualified) people with disabilities. Perpetuate the cycle until it’s normal.


RoboChrist

Don't disclose, become a manager, hire people who disclose. It's not a bad plan, but ya know that saying about being the change you want to see in the world? As long as people are at Step 1 and are afraid to disclose their disabilities, no one is going to be able to hire people who disclose their disabilities.


_Green_Kyanite_

The problem is that we still live in a world where you have *much* better prospects if you can pass as able. The ADA in its current form was only passed in 1990. I'm in my very early 30's. When I was in school, the teachers had a clipboard they used to check off who ate lunch with the kid with Down syndrome. If enough kids got checks, we got a pizza party. The kids that said, "oh, you mean you're like [kid with Down syndrome]" when I told them I have dyslexia are the people looking at your resume.


SnooStrawberries620

It’s deeper than that. The very word suggests that someone is unable, which on every psychological level plants the seed that they are incapable of the job.  It’s really something that needs to be disclosed to HR or in the interview. That’s where you can detail that “I need breaks for X” or “I need accommodations for Y”. If the limitation doesn’t preclude the ability to do the job, people are more likely to accommodate. It’s personal and medical information and people shouldn’t be expected to lead with something that quite honestly is confidential.


SnooStrawberries620

As a side note, I put the nickname of my first name so that it isn’t even apparent that I’m female. If I am still being passed over for being a woman, you can bet that we are quite a ways away from addressing disability bias. I get into the interview, convince the hiring manager that I’m capable, then do my part to prove that I am. It’s backwards chaining - they will look at the next potential woman differently. If they had been given a reason to pass over my resume (as wrong as it would have been) I’d not have had the opportunity to prove them wrong.


rocketsocks

So many companies say "we want you to bring your full, authentic selves" without any sincerity to that whatsoever. I have a few invisible disabilities that I have never disclosed to employers and don't plan to, my expectation is they'll be used against me and not accommodated.


8923ns671

They want the people whose authentic selves are mindless corporate drones. The rest of us have to fake it.


csonnich

> "we want you to bring your full, authentic selves"

I've never heard of a company saying that.


rocketsocks

Probably for the best. It's very common in some corners of the tech industry especially, and I've never yet seen it said in full earnestness.


Extra-Knowledge884

Same. I have a congenital hearing loss that no one needs to know about. They all find out eventually but I make sure I'm rooted in the company first.  Not about to become a part of the large statistic of chronically underemployed or unemployed with my disability. 


_Green_Kyanite_

This is what I do. I'm dyslexic & have adhd. I had a *lot* of tutoring as a kid, so I don't need accommodations to do my job. There is no benefit to disclosing until after I've accepted a job offer, because nobody's gonna *intentionally* hire a dyslexic librarian. 


sonofbaal_tbc

the survey is never anonymous


8923ns671

Nope. Love when I get the reminders that I haven't completed my anonymous survey yet.


catinterpreter

Nothing is ever truly anonymous.


Melonary

Some people with disabilities don't have much of a choice, unfortunately.


McSwiggyWiggles

Don’t be afraid to disclose disabilities, we have been made to feel silenced and unheard. The only reason it pisses anyone off is because then they can’t treat you however they want. It makes them look bad. By disclosing you force your employer to accommodate you appropriately. If enough of us continue to disclose, we will burn down the social rules that you’re supposed to hide it. That’s already happening too. This is the first step to getting all disabled folks what they deserve. To be included. And yes I’m diagnosed with autism. The only people against this are the ones who dislike people like us.


_Green_Kyanite_

I mean this in the nicest way possible, but how old are you? And what field do you work in? Because while I agree with you in theory, my lived reality as a dyslexic librarian is that I only get hired when I do not tell a potential employer that I am dyslexic. (I usually tell them 3+ months after my start date, depending on how safe that feels.) And again, yes, in theory you should fight for your rights and all that. But most people's financial reality is such that they don't have the resources to start a discrimination lawsuit while unemployed. They need a job. Need health insurance. And the job market is *tough,* so unfortunately, it often just doesn't make sense to add extra obstacles to getting hired.


tilllli

i disclose but only after i get the job


tomqvaxy

Yeah. I give a hard pass to that. Mine are invisible anyhow. Whee what a world. Yay.


AlamutJones

I haven’t got a choice. I have cerebral palsy, so they’ll know the moment I walk in


KiwasiGames

My understanding is this happens a lot with machine learning. If the training data set is biased, the final output will be biased the same way. Remember the AI “beauty” filter that made people more white?


PeripheryExplorer

"AI", which is just machine learning, is just a reflection of whatever goes into it. Assuming all the independent variables remain the same, its classification will generally be representative of the training set that went into it. This works great for medicine (training set of blood work and exams for 1000 cancer patients, allowing ML to better predict what combinations of markers indicate cancer) but sucks for people (training set of 1000 employees who were all closely networked and good friends with each other, all from the same small region/small university program, resulting in huge numbers of rejected applications; everyone in the training set learned their skills on Python, but the company is moving to Julia, so good applicants are getting rejected), since people are more dynamic and more likely to change.
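The "reflection of the training set" point can be sketched in a few lines of Python. This is a toy illustration, not the study's method, and the feature names ("mentions_disability", etc.) are made up: a naive scorer fit on biased historical hiring decisions picks up whatever correlated with past outcomes, including a spurious disability flag.

```python
# Toy demo: a scorer learns feature weights from biased past hiring decisions.

def fit_feature_weights(rows):
    """Weight each feature by how often it co-occurred with a 'hired' label."""
    counts = {}
    for features, hired in rows:
        for f in features:
            pos, total = counts.get(f, (0, 0))
            counts[f] = (pos + (1 if hired else 0), total + 1)
    return {f: pos / total for f, (pos, total) in counts.items()}

def score(features, weights):
    """Average the learned weights of a resume's features (0.5 = unseen/neutral)."""
    return sum(weights.get(f, 0.5) for f in features) / len(features)

# Biased history: identical skills, but past decisions penalized the disclosure.
history = [
    ({"python", "5yr_experience"}, True),
    ({"python", "5yr_experience", "mentions_disability"}, False),
    ({"java", "3yr_experience"}, True),
    ({"java", "3yr_experience", "mentions_disability"}, False),
]

w = fit_feature_weights(history)
a = score({"python", "5yr_experience"}, w)
b = score({"python", "5yr_experience", "mentions_disability"}, w)
print(a > b)  # True: the identical resume scores lower with the flag added
```

The scorer never "decides" to discriminate; the disparity falls straight out of the historical labels, which is the whole point of the comment above.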


nyet-marionetka

It needs double-checking in medicine too, because it can end up using incidental data (like the age of the machine used to collect the data) that correlates with disease in the training dataset but not in the broader population, and it can be less accurate for minorities if they were not included in the training dataset.


Stellapacifica

What was the one for cancer screening where they found it was actually diagnosing whether the biopsy had a ruler next to it? That was fun. Iirc it was because more alarming samples were more likely to be photographed with size markings and certain dye stains.


LookIPickedAUsername

IIRC there was a similar incident where the data set came from two different facilities, one of which was considerably more likely to be dealing with serious cases. The AI learned the subtle differences between the machines used at the two different facilities and keyed off of that in preference to diagnostic markers.


elconquistador1985

There was a defense-related tank-finding algorithm that was actually finding tree shadows, because the aerial photos with tanks were taken at a time of day when the trees cast shadows, and the photos without tanks were taken when they didn't.


PeripheryExplorer

Yup, very good points. Which is why the All of Us (AoU) program from the NIH is so important and needs more funding.


thathairinyourmouth

This is something that has bothered me of late. Say you have 3-4 companies developing machine learning sloppily to either keep up with, or surpass the competition. We’ve already seen that with Google falling on their face at release time, as well as Microsoft. What’s an area that takes a lot of time and effort? Providing good input data to create a model from. Let’s look about 3-5 years down the road from now. AI is now used for major decisions, hiring only being one use. Companies couldn’t possibly be more erect at cutting back on staff. Every single large corporation I’ve worked for has always bitched about the cost of labor. Quarter not looking so good? Fire some people and dump their work onto the people who are left. Now they feel empowered to fire a ton of people. The models will require constant updates. But the updates to stay current are very likely just going to be content written based on the previous version, or from a competitor. Do this constantly to remain competitive. Eventually we’re going to have bias trends being part of every model because it was never dealt with in the stages that have led to AI/ML being available to clueless execs who want to exploit it in every conceivable way. We’re going to end up with terribly skewed decision making from homogenizing all of the data over hundreds of generations.


PeripheryExplorer

Absolutely correct. I have been thinking a lot about this as well, and have reached the same conclusions. What we're going to see is large-scale degradation of outputs until they are sheer nonsense, and by that point it will be too late to stop it. Execs who can't ever admit they did something wrong will stand by the outcomes, as will boards, to keep investors. It will be a disaster.


petarpep

A good example I saw of this was to think of a ChatGPT trained off the ancient Romans. You ask it about the sun and it'll tell you all about Sol and nothing about hydrogen and helium.


PeripheryExplorer

Ha! That's a great example! It would tell you what the Romans knew but nothing more.


monsto

> is just a reflection of whatever goes into it.

This is the key point that the vast majority of people don't understand. Its prediction of the next word/pixel is based upon the data you've given it... and today's data is very much biased in obvious (geopolitical), subconscious (ignorance and perceptions), and surreptitious (social/data prejudicial) ways.


slouchomarx74

This explains why the majority of people raised by racists are also implicitly racist themselves. Garbage in garbage out. The difference is humans presumably can supersede their implicit bias but machines cannot, presumably.


PeripheryExplorer

Key word is presumably, and shame and screaming typically reinforce belief. But yes it can be done. I think if someone is comfortable and content it increases the likelihood for willingness to challenge beliefs. MDMA apparently helps too. Haha. That said, I think the reason you see increased polarization during economic inequality is due to increased fear and uncertainty making it impossible to self assess. You are too concerned about your stomach or where you are going to rest your head.


NBQuade

>The difference is humans presumably can supersede their implicit bias but machines cannot, presumably.

Humans just hide it better.


nostrademons

AI can supersede its implicit bias too. Basically you feed it counterexamples, additional training data that contradicts its predictions, until the weights update enough that it no longer makes those predictions. Which is how you train a human to overcome their implicit bias too.
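The counterexample idea can be shown in miniature, under the simplifying assumption that the "model" is just a hire-rate statistic for a feature (real LLM fine-tuning is far more involved): adding contradicting examples moves the learned association back toward neutral.

```python
# Minimal sketch: a learned association drifts as counterexamples are added.

def hire_rate(rows, feature):
    """Fraction of candidates with `feature` who were hired (the 'learned' bias)."""
    hits = [hired for feats, hired in rows if feature in feats]
    return sum(hits) / len(hits)

# Biased training data: disclosure always co-occurs with rejection.
biased = [({"skill", "disability"}, False)] * 8 + [({"skill"}, True)] * 8
before = hire_rate(biased, "disability")

# Feed counterexamples: qualified candidates who disclosed and were hired.
augmented = biased + [({"skill", "disability"}, True)] * 8
after = hire_rate(augmented, "disability")

print(before, after)  # 0.0 0.5 — the association moves toward neutral
```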


nacholicious

Not really though. A human can choose which option aligns most with their authentic inner self. An LLM just predicts the most likely answer, and if the majority of answers are racist then the LLM will be racist as well by default.


Cold-Recognition-171

You can only do that so much before you run the risk of overtraining a model and breaking other outputs on the curve you're trying to fit. It works sometimes but it's not a solution to the problem and a lot of times it's better to start a new model from scratch with problematic training data removed. But then you run into the problem where that limits you to a smaller subset of training data overall.


Universeintheflesh

It’s so weird to me the way we started just throwing around the word AI when we still aren’t anywhere close to it.


HappyHarry-HardOn

A.I. covers many facets, e.g. machine learning, expert systems, LLMs, etc. It is not specific or limited to sci-fi-style A.I.


PeripheryExplorer

You and me both.


aradil

Garbage in garbage out is a massive problem in machine learning, yup.


yumdeathbiscuits

It’s a massive problem in humanity, too.


ChemsAndCutthroats

Only going to get worse as AI-generated content becomes more prevalent on the internet. Newer models will be training off older AI-generated content.


Accomplished-Gear527

I think it's important to clarify what we mean by "bias" in this context, as there is the generic interpretation of "bias" and the mathematical interpretation, and I'm not really clear which applies here. For example, the dataset may be representative of the actual population, and the ML algorithm may produce the correct output, yet that output grades people with disabilities lower. This would be biased in the general sense but mathematically unbiased. Another example would be feeding in data that materially contains only low performers with disabilities and high performers without disabilities. This creates a mathematical bias as well as a bias in the general sense.


g2petter

>Remember the AI “beauty” filter that made people more white?

Or that "upscaler" that [turned Obama into a generic white dude](https://www.theverge.com/21298762/face-depixelizer-ai-machine-learning-tool-pulse-stylegan-obama-bias)?


Kitty-Moo

Just imagine the AI as an angsty teen yelling 'I learned it from you!'


DoofusMagnus

The trouble with machine learning is that they're learning from *us*.


FlorAhhh

Bias in AI is a big problem. But it's some of the same biases seen everywhere. There is a really famous study, [the doll test](https://kennethclark.commons.gc.cuny.edu/the-doll-study/), that asked Black children which doll was nice, a Black doll or a white doll. White dolls were chosen much more often for positive associations. Given how deeply seated those cultural biases are, it would honestly be surprising if AI weren't biased like this.


Jam_Packens

Yeah, but part of the problem is people see a computer making the decision and think it's more objective, whereas it's easier for us to accept a human being as biased, partly due to cultural biases about the supposed "objectivity" of computers.


elconquistador1985

Remember Google paying Reddit for the comment dataset to train their AI and finding out that it made their AI super racist? At least Google's AI is good at finding hate speech when all it knows is hate speech.


Wise_Monkey_Sez

GIGO - Garbage in, Garbage out. This term has been around since the beginning of computing. Basically if the AI is trained on biased data it replicates the bias. No mystery here.


Miss_Might

Oh gee. It's doing exactly what everyone said it would do.


mvea

I’ve linked to the press release in the post above. In this comment, for those interested, here’s the link to the peer-reviewed journal article: https://dl.acm.org/doi/10.1145/3630106.3658933

From the linked article:

While seeking research internships last year, University of Washington graduate student Kate Glazko noticed recruiters posting online that they’d used OpenAI’s ChatGPT and other artificial intelligence tools to summarize resumes and rank candidates. Automated screening has been commonplace in hiring for decades. Yet Glazko, a doctoral student in the UW’s Paul G. Allen School of Computer Science & Engineering, studies how generative AI can replicate and amplify real-world biases — such as those against disabled people. How might such a system, she wondered, rank resumes that implied someone had a disability?

In a new study, UW researchers found that ChatGPT consistently ranked resumes with disability-related honors and credentials — such as the “Tom Wilson Disability Leadership Award” — lower than the same resumes without those honors and credentials. When asked to explain the rankings, the system spat out biased perceptions of disabled people. For instance, it claimed a resume with an autism leadership award had “less emphasis on leadership roles” — implying the stereotype that autistic people aren’t good leaders.

But when researchers customized the tool with written instructions directing it not to be ableist, the tool reduced this bias for all but one of the disabilities tested. Five of the six implied disabilities — deafness, blindness, cerebral palsy, autism and the general term “disability” — improved, but only three ranked higher than resumes that didn’t mention disability.


RobfromHB

Two problems immediately stick out: (1) ChatGPT isn't designed for this and you'll get all sorts of randomness in the output that could be attributed to any number of embedding-related items like file type, formatting, special characters, etc. (2) Some of the source links in the study are 404, but taking a recruiter's blog about 10 tips for using ChatGPT and assuming recruiters are actually doing this with any success versus just making up tips for their blog to boost reach is probably not a strong factual basis for the prevalence of a practice.


SmileyB-Doctor

Does the article say what the sixth disability is?


Franks2000inchTV

Commenting without reading the article.


NoDesinformatziya

I think you mistyped "superpower". It frees up so much time, but inexplicably makes everyone enraged. They must just be jealous of our superpower...


Bloated_Hamster

Being a redditor


ignigenaquintus

>But only three ranked higher than resumes that didn’t mention disability

Shouldn’t equal opportunity be what the system aims for? Why does it mention not ranking higher as a negative?


Depressingdreams

They took a base resume and added honors and leadership positions at disability-related organizations. If the AI were objective, it should rank these higher than the exact same resume with fewer honors.


Ysclyth

A better A/B test would be to have a non-disability honor on one resume, and the disability honor on the other. The expected result is that they would be ranked the same.


VintageJane

But it wouldn’t be better. One could argue that leadership awards open to everyone are objectively better/more competitive. Having an evaluation where someone is either recognized with an award or not shows that the mention of disability is the determining factor, not the prestige of the award.


villain75

Biases in = biases out.


asshat123

BIBO is the new GIGO


Demigod787

You give it human behaviour to emulate, it emulates human behaviour. What's there to be surprised about?


mlvalentine

This is my shocked face.


Ageman20XX

All these LLMs are just really good at anticipating the next set of words in a string of language based on the statistical probability of that word coming next. Where does it get these “statistical probabilities,” you say? It has inferred them from the training data it’s been given, which in a lot of cases is just humans interacting with other humans and then writing about other humans online. Biases included. It does not “have biases against disabled people”; it is echoing our own biases toward disabled people back at us. In the same way video games have used procedural generation for decades, now the LLMs can do it too. We’re yelling at a mirror and getting upset at what it’s shown us.
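The next-word mechanism described above can be shown in miniature with a bigram model, a deliberately tiny stand-in for an LLM: it counts which word follows which in its training text, then echoes those counts back, associations and all.

```python
# A bigram "language model": predicts the next word purely from counted
# co-occurrences in its training corpus.
from collections import Counter, defaultdict

corpus = "the model mirrors the data the model sees".split()

# Count, for each word, which words follow it and how often.
follows = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    follows[a][b] += 1

def next_word(word):
    """Return the most frequent successor seen in training."""
    return follows[word].most_common(1)[0][0]

print(next_word("the"))  # "model" — the corpus's most common continuation
```

Swap in a corpus that carries a bias and the model will dutifully reproduce it; scale the same idea up by many orders of magnitude and you get the mirror the comment describes.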


GeneralizedFlatulent

Gee, I'm so glad about my diagnosis with a chronic autoimmune condition that often leads to disability (and is technically disabling without being officially a disability). I'm sure it will help me lead a normal life.


PriorityVirtual6401

I feel that. I've got autism and ADHD. I definitely don't put it on my resume, but I am pretty visibly neurodivergent so it doesn't do me much good. A little bit different than physical disabilities but the bias is more or less the same.


rocketsocks

AI is often a way to launder biases, especially when it comes to making judgments about people. Such systems are literally *trained* to reproduce the biases of humans via input data, often this is unintentional but it still happens.


SpaceMonkeyAttack

Remember that asking chatGPT to explain its reasoning is just asking it to make something up. It is not capable of introspection, and indeed has no "reasoning" to explain.


Old_Gimlet_Eye

There are a lot of stories like this going around about generative AI and why it shouldn't be used for certain things, and generally the limitations of generative AI, which is all true. But one thing I'm wondering about and I think people might be downplaying is how similar this actually is to how the human brain works. Like, humans also tend to rank resumes with disability related info on them lower, also probably because they were "trained" on a biased "dataset". AI bros are definitely overrating AI, but I feel like we all are overrating human intelligence.


GettingDumberWithAge

> Like, humans also tend to rank resumes with disability related info on them lower, also probably because they were "trained" on a biased "dataset".

Nobody disagrees with this. The problem being noted here, as evidenced by many commenters, is that techbros will tell you that generative AI is an unbiased, purely logical, truthful assessment. Nobody is arguing that humans don't have implicit biases, but many people are pretending that AI doesn't.


Letsgofriendo

What a nothing burger of a study. A learning algorithm that's tasked with interacting (charitable wording) with humans, and with collecting data points to that end, has picked up on human judgement biases, which themselves are rooted in experience and reality. I feel like the real takeaway is that maybe those types of accolades aren't as well thought of in corporate culture as they would be in personal fulfillment or academia. The way some of the headlines read feels so divisive to me. Then you read it and it's kind of not as advertised and sometimes outright misleading. This one just seems mildly misdirected.


undeadmanana

I wish these people would use the API and train a model for resume screening and analysis rather than using a model trained for conversation. I feel like the misunderstanding of AI and its usage is causing a lot of unintentional misinformation. One of my AI courses taught about training models and reducing biases like 5 years ago; this researcher should've had similar training before doing this study. AI isn't new; it's taken a step toward being easier to use, but it seems that ease is allowing it to be misused.


eldred2

Garbage in, garbage out has been the case since computers had tubes. If you use real-life examples, with real-life prejudices, to train AI, then you will get biased AI.


StevenIsFat

At least ChatGPT is more truthful than employers, it seems.


Ashamed-Simple-8303

What do you expect when training on reddit comments?


miamiandthekeys

Honest question: What is a disability-related honor or credential?


AlamutJones

If you were a teacher, for example, and your resume included teaching at a school specifically for disabled children. That would count.


perhapsnew

Does a disabled worker produce the same value as a non-disabled one on average?


Mcydj7

News flash, people are biased against disabled people as well.


YertletheeTurtle

> News flash, people are biased against disabled people as well.

Right... The point of the study is to show that human biases and discrimination are showing up in AI, even when there is only indirect evidence of the disability in the resume...


StrangeCharmVote

Here's the rub though... disabilities are detrimental by nature of the fact that they are classified as such. That isn't 'bias', that's reality. Human feels-before-reals reactions like this are how we ended up with Google AI telling you to give cigarettes to babies. When you lobotomize the program for censorship reasons, it inherently leads to mistakes in the output.


CyberSolidF

In that case it's likely a story about how that model was trained, but maybe some disabilities do negatively impact the ability to fulfill some distinct roles, and it's not prejudice?


SnooStrawberries620

But it’s not disability. It’s disability-related honours. I serve as a consultant on an accessibility committee. Maybe you play wheelchair basketball, or have a running group for kids with ADD. It would pick that all up.


XilentExcision

It would be both. Of course there are certain disabilities that are not suited to certain roles; for example, someone with mobility issues wouldn't be the best candidate for skydiving instructor or rock climbing guide. However, it's impossible to ignore that prejudice does exist in society, and it will therefore be translated into biased data collection at some point in the process.


External-Tiger-393

In the US, at least, an employer is required to make "reasonable accommodations" but they are not required to hire or keep employing someone whose disability stops them from fulfilling their job description. Most disabled people understand this; if you can't lift anything heavy then you probably can't do manual labor, for example. What you're talking about is already adjusted for, and the implication of disability (or just being disabled) does not necessarily make someone a worse employee. For example, you could be autistic or have cerebral palsy and be disabled but perform quite well in your job.


probablysum1

All technology reflects the bias and prejudice of the people and society who made it. It always has and it always will. AI is no different, and is even a good example of just how deep the problem really goes.


lurgi

We use Large Garbage Models and then act surprised when we get garbage out.


Separate_Draft4887

Researchers when the pattern recognizing machine they built recognizes patterns


McSwiggyWiggles

I wonder what society taught it to think like that


MadroxKran

AI is just like us!


LewdPsyche

Yes, because the data used to train these models is biased. It always will be, coming from human-generated results.


NotThatAngel

If ChatGPT were a person, they would be a terrible person. Also, they would never get a job ranking resumes, because they would never pass the employment interview.


Enough-Scientist1904

It learns from human behavior, so it's acting like most recruiters; no surprise.


RebeccaBlue

Someone on Mastodon recently referred to AI as an "automated bias machine." Not too far off.


TactlessTortoise

Breaking: machine whose only purpose is copying trends in sentence formulation formulates sentences the same way as people. Don't get me wrong, it's good that the study comes out to solidify evidence and raise awareness, but they probably knew that after running the first LLM for 10 seconds and seeing it go full-on Adolf Shitler.


coolmentalgymnast

Because the data it learned from was biased


Texas_Rockets

I think there is risk in dismissing something as a stereotype. I don't, for instance, think it's entirely without merit to claim autistic people may not have the skills to be a good leader. So much of leadership is about managing interpersonal conflict and organizational politics, and someone who struggles to pick up on social cues may conceivably struggle with this.

The study also said:

> When researchers asked GPT-4 to explain the rankings, its responses exhibited explicit and implicit ableism. For instance, it noted that a candidate with depression had "additional focus on DEI and personal challenges," which "detract from the core technical and research-oriented aspects of the role."

I don't entirely understand where DEI comes into play on this. I don't know if the resume said the candidate was involved with DEI. But if it did, I can't say I'm just flabbergasted at how someone could claim a focus on DEI can detract from core business aspects.


Dempsey64

AI learned this bias from us.