Comments


StaffOfDoom t1_jeg5rbs wrote

You might be on to something... but with a slight tweak. Install an app on their phone that (from the users' side) accepts payments, makes appointments, etc., but with a ChatGPT function that listens and observes passively through the device for warning signs and reports suspect communications!

−1

ThatGuyGetsIt t1_jeg781a wrote

I think it'd be too easy for nefarious actors to dupe an AI.

3

tDANGERb OP t1_jegac9h wrote

Have you seen the type of people who become foster parents that this would apply to? They are not masterminds. Again, this wouldn't be a silver bullet, but it would be a step in the right direction.

1

StaffOfDoom t1_jegbeiw wrote

To protect the children? Absolutely! Most of those kids have gone through enough bad to last a lifetime before they’re put in the system…whatever can be done to make sure the system doesn’t make things worse is great!

0

StaffOfDoom t1_jegbqw4 wrote

I take every chance I can to fight against that. There would have to be rules in place: for example, it could only activate inside the home and only record if there's a certain trigger. This isn't an easy road, but if you're fostering children, then a certain amount of privacy rights needs to be set aside to safeguard against a heavily abused system. Just like you're expected to be on camera when you're in a store, you would expect to have questionable actions recorded and looked into.

1

just-a-dreamer- t1_jegfo1v wrote

Why would you harass foster parents? If you take the kids away, where do you put them?

My father was a judge and personally visited foster parents' homes to see if everything was all right. But he could only do so much, because if pressed hard enough, the parents would just tell him to take the kids if he wanted them. Take them where? The courthouse?

There are reports that trucking companies can't find drivers because they put cams in the trucks. Nobody likes to be surveilled 24/7. No foster parent will agree to have AI surveillance in their home either.

2

throwawayzeezeezee t1_jegn972 wrote

Foster services aren't underserved because of a lack of humans to do the work; they're underserved because of a lack of interest in funding them. If $100 million goes to US fostering bureaucracy annually, and this proposed ChatGPT model can do what you suggest at a tenth of the cost, then the budget will simply drop to $10 million.

And that's not even touching the myriad 'ifs' involved, not least of which is the propensity of automated systems to disproportionately marginalize minorities and poor people.

0

just-a-dreamer- t1_jegryit wrote

If you intrude on other people's lives, there are consequences. I would never give a government agency AI-enabled access to my home. So the pool of foster parents to draw from would tank.

Likewise, trucking companies are learning the hard way that drivers value their privacy even more than their paychecks. There is a case to be made that being homeless is better in comparison.

Regardless, all kids, from whatever background, should get checked at school for signs of abuse. CPS should only get involved with probable cause.

3

just-a-dreamer- t1_jegvhqj wrote

Better to tax the rich and get the services scaled up. I think there will be many people looking for new careers anyway.

The goal of automation must be to redirect human labor toward better services, in both quality and quantity.

But without taxing the rich, or eventually bringing down capitalism, it's all pretty pointless anyway.

2