DeadliestPoof
DeadliestPoof t1_j49t18v wrote
Reply to Don't add "moral bloatware" to GPT-4. by SpinRed
Question for OP, purely out of curiosity, no particular intent.
Do you view AI as a tool? That is, anything that can be wielded, like a hammer, isn't good or bad in itself: whether you strike a nail head or a human head, the tool itself has no alignment?
Or
Or do you view AI as an entity? That is, AI will have "intent" at some point, and if morality rules aren't put in place it would operate in a purely logical manner: in the same way a living organism runs on "survival instincts," it would run purely on "logical intent."
I'm hoping this made sense… human lack of intelligence generated this comment.
DeadliestPoof t1_j49s08o wrote
Reply to comment by Memomomomo in Don't add "moral bloatware" to GPT-4. by SpinRed
“You never go full retard” x)
DeadliestPoof t1_j5se0d6 wrote
Reply to comment by MariualizeLegalhuana in Future-Proof Jobs by [deleted]
I agree about nursing, but for different reasons: building machines that replicate the complex movements and diverse range of tasks a nurse performs would be far more costly than keeping human labor.
Truth be told, my hypothesis is that most people would rather deal with an AI interface built into customer service protocols than with a likely overworked healthcare worker who has burnt out from endless subpar treatment by fellow humans. AI wouldn't have to worry about burnout, fatigue, or emotion.
I expect most healthcare workers to become glorified nurses, with AI handling the majority of diagnoses and instructions to staff. Surgeons and ER doctors will be relatively safe, but expect shrinkage even in those positions due to the optimization of senior roles.