Submitted by ADefiniteDescription t3_z1wim3 in philosophy
eliyah23rd t1_ixd8ne1 wrote
Amazed that the article does not mention "Minority Report". Spoiler! >!The movie posits a future where the tech is so advanced that the police know in advance when the crime will be committed. (Pity the movie turned to psychics instead.)!<
If today the program can tell the neighborhood, tomorrow it will be the street. Will we hit quantum effects before we can tell which house and when?
However, algorithms and computing power are not the only parameters. If we add extensive and invasive data collection to the process, the path from today to that moment is quite evident.
The questions are: (1) Do we want to continue increasing the data collection levels? (You could argue that it will correlate with safety for some.) (2) Do we want to keep this data collection in the hands of opaque institutions? (OTOH, if you make it more public, the chance of a leak arguably increases.)
One last point. You'd be amazed how useful "innocent" incidental data is. Just the expressions on faces or even clothing style and gait may correlate with other data in unexpected ways.
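To make that concrete, here's a toy sketch (purely synthetic data, hypothetical feature names, nothing from any real system) of how an "innocent" signal can end up correlated with something sensitive it never measured, just because both share a hidden common cause:

```python
# Toy sketch: purely synthetic data, hypothetical feature names.
# The point: an "innocent" signal can pick up a hidden common cause and
# end up correlated with something sensitive it never measured.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hidden confounder (neighborhood, income, health...) -- never observed directly.
latent = rng.normal(size=n)

# "Innocent" incidental observation, e.g. walking speed caught on camera.
gait_speed = 0.6 * latent + rng.normal(scale=0.8, size=n)

# A sensitive attribute the camera was never pointed at.
sensitive = 0.7 * latent + rng.normal(scale=0.7, size=n)

# The incidental feature leaks information about the sensitive one.
r = np.corrcoef(gait_speed, sensitive)[0, 1]
print(f"correlation between 'innocent' feature and sensitive attribute: {r:.2f}")
# Comes out around 0.4, even though neither variable references the other.
```

The numbers are made up, but the mechanism is the real worry: nobody has to point a camera at the sensitive thing for the sensitive thing to leak.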
d4em t1_ixdqech wrote
>One last point. You'd be amazed how useful "innocent" incidental data is. Just the expressions on faces or even clothing style and gait may correlate with other data in unexpected ways.
Looking angry on your way home because you got a cancer diagnosis and you're convinced life hates you? The police will now do you the honor of frisking you because you were identified as a possible suspect!
Are you a person of color who recently immigrated? Were you aware immigrants and persons of color are disproportionately responsible for crimes in your area? The police algorithms sure are!
This is an ethical nightmare. People shouldn't be treated as suspects based on innocent information. Even holding them suspect for a future crime because of one they committed in the past is iffy. There's a line between vigilance and paranoia that's being crossed here.
And neither should we monitor everything out of the neurotic obsession someone might do something that's not allowed. Again, crossing the line between vigilance and paranoia. Like, crossing the line so far that the line is now a distant memory that we're not really sure ever existed. Complete safety is not an argument. Life isn't safe and it doesn't have to be. We all suffer, we all die. There is a need to strike a balance, so we can do other things besides suffering and dying. Neither safety nor danger should control our every second.
bildramer t1_ixi99wd wrote
On the one hand, sure, I want to be free to murder people if I really want to, and free of creepy 24/7 observation, and people shouldn't assume things about me even if they're 100% accurate, and I would never trust anyone who wants to put cameras on me while claiming it comes from a desire to reduce murders - let alone lesser crimes.
On the other hand, if we really had a magical technology that allowed us to predict and stop murders with perfect accuracy and without the usual surveillance indignities and risks, it would be criminal not to use it. That hypothetical wouldn't be just another way for the powerful to assert themselves. And the problem with using it for other crimes is mostly that certain actions shouldn't be criminal, i.e. that the law is not lenient enough or not specific enough (perhaps for good reasons). In an ideal world with better institutions, we would resolve such a problem by changing the law.
eliyah23rd t1_ixdzjjc wrote
That might happen, and it's a danger, but it's not the mainline scenario.
Data being collected on facial expressions in the billions is more likely. Then you correlate that with other stuff. Bottom line: it's as if the cameras were installed in the privacy of your home, because mountains of data in public provide the missing data in private.
Then you correlate the inferred private stuff with more stuff. That's how you build "Minority Report".
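A rough sketch of that two-step chain, again on purely made-up data with hypothetical feature names: first a model recovers a private attribute from public observations, then the inferred attribute gets fed into whatever comes next:

```python
# Toy sketch of the two-step chain above, on synthetic data with
# hypothetical feature names. Nothing here is a real system; it just
# shows the mechanism: public signals -> inferred private attribute ->
# downstream scoring.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 50_000

# Public, camera-visible signals (say: expression score, hours outside, gait).
public = rng.normal(size=(n, 3))

# A private fact (health, household, whatever) that the public signals
# partially betray -- nobody ever asked for it directly.
private = (public @ np.array([0.9, -0.4, 0.6]) + rng.normal(size=n)) > 0

# Step 1: a model trained once on a small labelled subset recovers the
# private fact for everyone else, from public data alone.
model = LogisticRegression().fit(public[:5_000], private[:5_000])
inferred_private = model.predict_proba(public[5_000:])[:, 1]

# Step 2: the *inferred* private attribute feeds the next model
# (here a dummy "risk score"), as if it had been observed in your home.
risk_score = 0.8 * inferred_private + 0.2 * rng.random(size=n - 5_000)

print(f"private attribute recovered from public data, accuracy: "
      f"{model.score(public[5_000:], private[5_000:]):.2f}")
```

The second model never needs access to your home; the first model's guesses stand in for it.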
d4em t1_ixe1anb wrote
>Data being collected on facial expressions in the billions is more likely. Then you correlate that with other stuff. Bottom line: it's as if the cameras were installed in the privacy of your home, because mountains of data in public provide the missing data in private.
I would say this constitutes "monitoring everything out of the neurotic obsession someone might do something that's not allowed", wouldn't you?
draculamilktoast t1_ixdhekk wrote
> (1) Do we want to continue increasing the data collection levels? (You could argue that it will correlate with safety for some.)
Yes, because we wish to extinguish privacy.
> (2) Do we want to keep this data collection in the hands of opaque institutions?
Yes, because we crave post-Orwellian authoritarianism so nightmarish it makes North Korea look like anarchy.
I'm not being sarcastic, I'm making observations.
eliyah23rd t1_ixdofo9 wrote
We, the watched, need to seize the power to choose.
I'm looking for really practical suggestions about how to get this going.
RFF671 t1_ixd8sjm wrote
The spoiler tag's formatting is messed up; it didn't hide the actual spoiler.
eliyah23rd t1_ixddjv7 wrote
Thank you. I have never tried to use the feature before and was not aware of what the protocol was.
Do you think, BTW, that for an older movie and such a general comment it is necessary to take this precaution?
Anyway, fixed it. If this had been the first thing I learned today, I would say that it was wort getting up this morning. But, thankfully, my day has been full of such experiences. ;)
RFF671 t1_ixdfi5s wrote
It might not be necessary but you took the effort and I figured letting you know about it was in line with your original intention.
May the rest of your day look up from here! And the funny thing is, I think 'wort' was supposed to read as 'worst'. Ironically, I'm an avid brewer, so a wort day is a very good day indeed, lol.
eliyah23rd t1_ixdn7ad wrote
>wort
:laughing:
(I keep hoping that somebody is reading reddit with a proper markdown viewer. Emoticons don't work for me here.)
BatmanDuck123 t1_ixdkinv wrote
Have you watched this?
eliyah23rd t1_ixdr4b4 wrote
Fantastic video. Thank you.
This is the biggest thing happening on an ethical and social level IMO.
I am proficient with the tech. I can write Transformers, download HuggingFace models, and I know what these words mean. But I have no idea about the ramifications of this stuff on society. The people making policy, I am sure, know even less than me, and probably nothing about the technology.
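To give a sense of how low the barrier is, this is roughly all it takes to pull down and run a pretrained model from the Hub (the model id below is a standard public sentiment classifier; any other Hub id works the same way):

```python
# Minimal sketch of "download a HuggingFace model": a couple of lines pull
# down a pretrained Transformer anyone can run.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

print(classifier("Policy is lagging far behind the technology."))
# Prints something like [{'label': ..., 'score': ...}]
```

The capability is trivially accessible; the understanding of its consequences is not.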
We need to give control of these changes to the broadest group possible.
The light of the sun has the power to purify.
flow-addict t1_ixdnib8 wrote
It might have the opposite effect. Being denied privacy could make people revolt violently. Why would they respect society and its people when they are so disrespected that they can't have any privacy at all?
eliyah23rd t1_ixdrb7o wrote
Maybe it will and maybe it won't. Who knows?
flow-addict t1_ixdyab9 wrote
That's good enough (not making hasty assumptions).