I can imagine a future where there's gridlock in front of the police station, with AV cars full of Black people, whenever the cops send out an APB with the description of a Black suspect.
We've seen plenty of racist AI programs in the past because the programmers, intentionally or not, added their own bias to the training data.
Any dataset sourced from human activity (e.g., internet text, as in ChatGPT) will always contain current societal biases.