LastInALongChain t1_j1s5on6 wrote
Reply to comment by JMAN1422 in NYC's AI bias law is delayed until April 2023, but when it comes into effect, NYC will be the first jurisdiction mandating an AI bias order in the world, revolutionizing the use of AI tools in recruiting by Background-Net-4715
>How can AI be biased if it's only looking at raw data? Wouldn't it be inherently unbiased? I don't know, just asking.
Data can be misleading when it describes groups rather than individuals.
Suppose one person belongs to group z, and this person is a criminal who steals and commits assault; you wouldn't want to hire him. But the AI doesn't reject him for his own record, it rejects him because he belongs to group z, and group z on average commits 10x the crime of any other group. It then does the same to another guy from group z who has a spotless record, or whose brother was killed by crime, flagging him as "at risk" of committing crime himself simply by association, because people in that situation sometimes revenge kill.
Basically, the AI can only see aggregate behavior, because actually judging individuals would take a level of insight that would require a dystopian amount of real-time access to that person's data.
Technically an AI could look at groups and conclude "on average these guys have good traits," but that's literally the definition of bigotry.
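To make it concrete, here's a toy sketch (all names, numbers, and thresholds are invented, not any real hiring system): a scorer whose only input is group membership will reject a spotless applicant from group z exactly as readily as one with a long record, because the individual record never enters the calculation.

```python
# Toy illustration: a "model" whose only feature is group membership.
# The group-level misconduct rates below are made up for the example.
group_risk = {"group_x": 0.02, "group_y": 0.03, "group_z": 0.20}

def score_applicant(applicant):
    # The model never looks at the individual's own record --
    # it only knows the aggregate rate for the group they belong to.
    return 1.0 - group_risk[applicant["group"]]

applicants = [
    {"name": "A", "group": "group_z", "record": "multiple convictions"},
    {"name": "B", "group": "group_z", "record": "spotless"},
    {"name": "C", "group": "group_x", "record": "spotless"},
]

for a in applicants:
    decision = "hire" if score_applicant(a) > 0.9 else "reject"
    print(a["name"], a["record"], "->", decision)

# Both group_z applicants get rejected, regardless of their individual
# records, because the only signal available is the group average.
```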