Submitted by ShakeNBakeGibson t3_10wblpv in IAmA
supertyson t1_j7mqwnc wrote
It's great that large datasets are being pulled in, but what are the procedures for making sure that the data itself is good/useful?
IHaque_Recursion t1_j7n1vgs wrote
We run our experiments in-house so that we can control the quality and relevance of the data. This type of attention to detail requires doing a lot of unsexy, behind-the-scenes operational improvements to control for as many 'exogenous' factors as possible that can influence what actually takes place in our experimental wells. To manage this, we have (to an extent) backward-integrated with our supply chain so that we can (i) anticipate where possible or (ii) correct for changes in the media our vendors supply, different coatings that suppliers may put on plates, etc.

Additionally, we have built an incredibly robust tracking process that allows us to capture metadata from every step in our multi-day assay, so that we maintain precise control over things like volume transfers, compound dwell times, and plate movements, further ensuring comparability across experiments. I also wrote more earlier in the AMA about how we handle batch effects!
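As a rough illustration of what per-step metadata tracking like this might look like, here is a minimal sketch in Python. All class, field, and station names are hypothetical; this is not Recursion's actual tooling, just one plausible shape for an append-only log of assay step events that can later be joined against results for batch-effect analysis.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AssayStepEvent:
    """Metadata captured for one step of a multi-day assay (hypothetical schema)."""
    plate_id: str    # which plate the step acted on
    step: str        # e.g. "volume_transfer", "compound_dwell", "plate_move"
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    details: dict = field(default_factory=dict)  # step-specific measurements

class AssayLog:
    """Append-only log of step events for later quality control and batch-effect work."""
    def __init__(self) -> None:
        self.events: list[AssayStepEvent] = []

    def record(self, plate_id: str, step: str, **details) -> None:
        self.events.append(AssayStepEvent(plate_id, step, details=details))

# Example: logging the kinds of factors mentioned above (values are made up).
log = AssayLog()
log.record("PLATE-0001", "volume_transfer", volume_ul=25.0, source="reservoir_A")
log.record("PLATE-0001", "compound_dwell", dwell_minutes=1440)
log.record("PLATE-0001", "plate_move", from_station="incubator_3", to_station="imager_1")
```

The point of a structure like this is that every well's readout can be traced back to the exact conditions it experienced, so systematic differences (a new media lot, a longer dwell time) can be detected and corrected rather than silently contaminating the dataset.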