Submitted by Ssider69 t3_11apphs in technology
drawkbox t1_j9tmxme wrote
Reply to comment by Effective-Avocado470 in Microsoft Bing AI ends chat when prompted about 'feelings' by Ssider69
Yeah, devs aren't really in control once they feed in the datasets. Over time, datasets will be manipulated or polluted, whether deliberately or unwittingly, and that can produce unexpected results. Any system that needs to be reliably logical should think hard about whether it wants that attack vector. For things like idea generation this may be fine; for standard data sets or decision trees that carry liability, probably not.
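To see how little control you have once poisoned data is in the mix, here's a minimal sketch (hypothetical numbers, plain least-squares, not any particular company's model) of how just a few bad labels can swing a fitted model:

```python
# Sketch of dataset poisoning: fit a least-squares slope to clean data,
# then refit after injecting a handful of adversarial labels.

def fit_slope(points):
    """Ordinary least-squares slope for (x, y) pairs."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    num = sum((x - mx) * (y - my) for x, y in points)
    den = sum((x - mx) ** 2 for x, _ in points)
    return num / den

clean = [(x, 2 * x) for x in range(10)]   # true relationship: y = 2x
poison = [(9, -100)] * 3                  # three maliciously mislabeled points

print(fit_slope(clean))           # 2.0
print(fit_slope(clean + poison))  # slope dragged negative by 3 bad points
```

Three poisoned points out of thirteen are enough to flip the learned slope's sign here; at the scale of an ad network's training data the effect is subtler, but the mechanism is the same.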
Unity, the game engine company, had this happen to its ad network: one quarter their ads were really out of whack, and it traced back to bad datasets. AI can be a business risk; in their case it caused real revenue issues. We're going to be hearing more and more of these stories.
The Curious Case of Unity: Where ML & Wall Street Meet
> One of the biggest game developers in the world sees close to $5 billion in market cap wiped out due to a fault in their ML models