Submitted by Gigglemind t3_1247cf5 in news
RonBourbondi t1_jdzsrvn wrote
Reply to comment by PicklerOfTheSwamp in BBC News: Clearview AI used nearly 1m times by US police, it tells the BBC by Gigglemind
I'm sure your tune would change if this helped identify someone who kidnapped a child.
Mobely t1_jdzvrtd wrote
And if it identified the wrong person? Who could not produce the child they don't have? And who is sentenced to death for presumably murdering said child, since the child is nowhere to be found?
An_best_seller t1_je0dfi6 wrote
Trigger Warning: Mass-shooting, rape.
I think this technology shouldn't be used as evidence, just as a tool to find potential (but not definitive) perpetrators and to find suspects much faster. The person should only be sentenced by a judge if investigators find evidence of the crime that is not based on their face.
Here are some examples:
- There is a mass shooting. A camera captures the mass shooter's face. Police don't know who or where the mass shooter is. Police use artificial intelligence to find people with a similar face. They find 7 people in the USA with a matching face. They investigate each person and find that 1 of the 7 face matches bought a gun that uses the same type of bullets as the ones found at the crime scene. They also find that this suspect owns shoes matching the bloody footprints at the crime scene, and that the suspect had searched the location of the crime scene on Google Maps before the shooting happened. They arrest this one suspect, keep looking for more evidence, and finally go to trial, where the overwhelming evidence shows they are guilty, so they are sentenced to life in prison or the death penalty (I'm not going to argue right now whether the death penalty is wrong or right; that's off-topic).
- A woman is raped by a man. A camera at a bar captures the rapist's face. Police don't know who or where the rapist is. Police use artificial intelligence to find people with a similar face. They find 9 people in the USA with a matching face. They take DNA samples from each of these 9 people and compare them to the DNA from the semen found in the victim's body. It turns out that 1 of these 9 people has the exact same DNA. They investigate this man and find that his friends were at the same bar on the same day as the crime. His friends tell police that they were with the suspect at that bar on that day. The man goes to trial, more evidence of the crime is found, and he is sentenced.
As you can see, I don't support using artificial intelligence as definitive evidence to sentence someone to prison or death. But I think it can make the process of finding potential perpetrators much easier and much faster, and then allow police to start looking for evidence on each of the suspects. If police don't find evidence against one, several, or all of the suspects, they should let them go. A suspect should only be sentenced if more evidence is found than the AI match itself. Of course, when I say they should be sentenced if "more evidence" is found, I mean solid, important evidence, not evidence such as "the suspect lives in the same city as the victim, therefore they are guilty." I mean high-quality evidence.
By the way, I don't know much about crimes, types of evidence, or police protocols, so take what I say with a grain of salt. I'm just guessing what the process could look like.
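A rough back-of-the-envelope sketch of the "7 people with the same face" scenario described above, in Python. The database size and false match rate here are invented assumptions for illustration only, not figures from the article or from Clearview AI:

```python
# Back-of-the-envelope estimate of how many innocent look-alikes a single
# face search could return. Both numbers below are assumptions for
# illustration, not real Clearview AI figures.

database_size = 330_000_000      # hypothetical nationwide photo database
false_match_rate = 2e-8          # assumed per-comparison false match rate

expected_false_matches = database_size * false_match_rate
print(f"Expected innocent look-alikes per search: {expected_false_matches:.1f}")
# ~6.6 with these assumptions, which is why the comment above treats the
# hit list as a starting point for investigation rather than as evidence.
```

Even a tiny per-comparison false match rate yields several innocent look-alikes at national scale, which is the reason the comment insists on independent evidence before anyone is charged.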
RonBourbondi t1_jdzwb2z wrote
So don't use all your tools before they kill the child?
Mobely t1_je038qm wrote
Well, if we're going down that road, almost all child kidnappings are done by the kid's other parent, and it's not to kill them.
So if we are looking to stop all child kidnappings that result in the child's murder, we would have an insanely high false positive rate. You'd be jailing thousands of people, leaving their kids orphaned and vulnerable to violence. So yeah, don't use the shitty tools to cause more harm than good.
FPR = FP / N = FP / (FP + TN)
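A minimal Python sketch of the base-rate point being made here: even a very accurate matcher flags far more innocent people than offenders when the targeted crime is rare. The population, offender count, and error rates below are invented for illustration only:

```python
# Minimal illustration of the base-rate problem: a matcher that is right
# almost all the time still flags mostly innocent people when the crime
# is rare. All numbers are invented for illustration.

population = 330_000_000        # roughly the US population
true_offenders = 100            # hypothetical number of actual offenders
true_positive_rate = 0.99       # assumed: 99% of offenders are flagged
false_positive_rate = 0.001     # assumed: 0.1% of innocents are flagged

negatives = population - true_offenders            # actual innocents (FP + TN)
false_positives = false_positive_rate * negatives  # innocents flagged
true_positives = true_positive_rate * true_offenders

# Of everyone flagged, what fraction are actually offenders?
precision = true_positives / (true_positives + false_positives)

print(f"Innocent people flagged: {false_positives:,.0f}")           # ~330,000
print(f"Offenders flagged:       {true_positives:,.0f}")            # ~99
print(f"Chance a flagged person is an offender: {precision:.3%}")   # ~0.030%
```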
RonBourbondi t1_je065xw wrote
Nah, because this goes off of pictures to identify them.
Not only that, cops post pictures of suspects on the news all the time. This AI is no different than crowdsourcing, except it's more accurate and better.
Joe-Schmeaux t1_je0j9l0 wrote
So trust the police with even more powerful tools?
RonBourbondi t1_je0x8jl wrote
So hinder an investigation and have a child not be saved from murder?
Joe-Schmeaux t1_je15p0r wrote
So trade one set of murders for another?
RonBourbondi t1_je16edc wrote
Who's getting murdered from this?
Joe-Schmeaux t1_je170nr wrote
From the police misusing identification software and apprehending innocent people who end up in prison? Any such person would be at risk for murder or suicide. It's a shitty situation as is, let's not add things that can make it worse and give the already powerful, corrupt police forces of the world even more power. Trusting them to not misuse this can make things even worse, and we'll still have people being kidnapped.
RonBourbondi t1_je17iu7 wrote
So has there been a single case of them using the AI software where this happened?
Joe-Schmeaux t1_je18w64 wrote
I just googled it and this was the first article to come up. He spent ten days in jail and $5000 in legal defense. This was three years ago. He may not have suffered physical harm, but the potential for misuse and abuse of this kind of power is concerning.
usalsfyre t1_je37ndv wrote
You’re not supposed to deep throat the boot….
Caster-Hammer t1_je04ib5 wrote
Let's play "find the fascist."
RonBourbondi t1_je069c7 wrote
So you're against cops posting pictures of criminals on the news to get tips on who they are?
Caster-Hammer t1_je1zs0p wrote
So you're for moving the goalposts to defend an encroaching police state?
RonBourbondi t1_je20tvg wrote
If you want to call it that, go ahead.
Nothing wrong with using tools to track down criminals.
piTehT_tsuJ t1_je0162o wrote
It would be great if it did. The problem is that facial recognition isn't anywhere near 100% accurate, and this could lead to false arrests and the real kidnapper getting away.
RonBourbondi t1_je01dk0 wrote
Yeah I will gladly take a false arrest that can easily be cleared up over a dead child.
Antnee83 t1_je0brsg wrote
"easily cleared up"
Tell me you have no experience with this without telling me...
Have fun getting a job when every background check shows "ARRESTED FOR KIDNAPPING A CHILD." Good fucking luck "clearing that up easily"
RonBourbondi t1_je0c8i6 wrote
So are you also against releasing a suspect's pictures on the news, which can lead to tips and visits from the police even on incorrect identifications?
Because this is no different.
TogepiMain t1_je0e2zc wrote
I sure am! You know how many lives are ruined by being thrown up on the "suspect" wall? No one cares that they didn't do it; all that matters is that their photo was in the news with the words "probably did a crime??" underneath.
RonBourbondi t1_je0etka wrote
Yet countless people have been caught this way, and lives have been saved by posting pictures of the actual perpetrators to crowdsource an answer of who they are.
piTehT_tsuJ t1_je0ky6t wrote
Crowdsource? Like Reddit's hunt for the Boston bombers...
TogepiMain t1_je0fd8k wrote
Gonna need some actual numbers on that or else who can say.
They're probably not worth the damage, long term.
Paizzu t1_je0xlzu wrote
> "Think of the children" (also "What about the children?") is a cliché that evolved into a rhetorical tactic.
RonBourbondi t1_je0z8a0 wrote
Think of the kidnapped then. Lol.
If you have footage and pictures of the perp, run them through an AI database to help narrow down suspects and save lives.
Not particularly controversial.