Submitted by deepestdescent t3_z4oxq5 in MachineLearning
Hydreigon92 t1_ixs6lui wrote
Maybe InterpretML? It's developed and maintained by Microsoft Research and consolidates a lot of different explainability methods, including SHAP, into a consistent API.
deepestdescent OP t1_ixs7uig wrote
Thank you for that. Does it use the shap library under the hood, or does it have its own implementation for computing Shapley values? If it is using shap, then I wouldn't really call it an alternative, since it relies on the same unmaintained backend.
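(For anyone unfamiliar with what "computing Shapley values" involves: the exact definition averages a feature's marginal contribution over all coalitions of the other features. This is a minimal illustrative sketch of that exact formula in pure Python; real libraries like shap use approximations such as kernel or tree-based estimators instead, because the exact sum is exponential in the number of features.)

```python
from itertools import combinations
from math import factorial

def shapley_values(n, value):
    """Exact Shapley values for a cooperative game with n players.

    value: function mapping a frozenset of player indices to a float
    (the worth of that coalition). O(2^n) -- illustrative only; not
    how shap computes values in practice.
    """
    phi = [0.0] * n
    for i in range(n):
        others = [p for p in range(n) if p != i]
        for k in range(len(others) + 1):
            for subset in combinations(others, k):
                s = frozenset(subset)
                # Shapley weight: |S|! (n - |S| - 1)! / n!
                w = factorial(len(s)) * factorial(n - len(s) - 1) / factorial(n)
                # Weighted marginal contribution of player i to coalition S
                phi[i] += w * (value(s | {i}) - value(s))
    return phi
```

For an additive game (each player contributes a fixed amount), the Shapley values recover exactly those contributions, which is a handy sanity check for any implementation.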
deepestdescent OP t1_ixsnb2v wrote
Yep, so InterpretML does use the unmaintained shap library, unfortunately. It looks like the author of shap works for Microsoft, though, so maybe he also works on InterpretML? I just don't understand why shap isn't being actively maintained when so many projects rely on it.
memberjan6 t1_ixv9h74 wrote
You might actually be the best candidate to begin to maintain shap.