
wrathtarw t1_j1ocd3y wrote

The same bias that is present in the medical system is then programmed into the algorithm: machine learning essentially condenses the patterns in its source data and uses them to determine the output. Garbage in, garbage out…

If the source is flawed, so too will be the algorithm: https://developer.ibm.com/articles/machine-learning-and-bias/

And the source is flawed: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8344207/
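To make the "garbage in, garbage out" point concrete, here is a minimal sketch (entirely synthetic data, hypothetical feature names, scikit-learn used for convenience; this is not the actual system being discussed): if the historical treatment decisions used as training labels were biased against one patient group, a model fit to them will score two clinically identical patients differently.

```python
# Minimal sketch of bias propagation: hypothetical, synthetic data only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical features: a clinical severity score and a group indicator.
severity = rng.normal(size=n)
group = rng.integers(0, 2, size=n)            # e.g. a demographic group flag

# Biased historical labels: group 1 was under-treated at the same severity.
p_treated = 1 / (1 + np.exp(-(severity - 1.5 * group)))
treated = rng.binomial(1, p_treated)

X = np.column_stack([severity, group])
model = LogisticRegression().fit(X, treated)

# Two clinically identical patients who differ only by group membership:
patients = np.array([[1.0, 0], [1.0, 1]])
print(model.predict_proba(patients)[:, 1])    # group 1 gets a lower score
```

The model hasn't "learned medicine"; it has learned the historical decisions, bias included.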

5

wrathtarw t1_j1m3ij4 wrote

Things like this are not the helpful tools they appear to be. They are only as good as the researchers and data used to train them, and they are significantly biased by both.

The opiate crisis is a disaster, but it has also created significant problems for people who have never abused their medications and need them to be functional.

https://internationalpain.org/how-the-opiate-crisis-has-affected-chronic-pain-sufferers/

https://www.sapiens.org/biology/chronic-pain-opioid-crackdown/

306