programmerChilli t1_iufqn15 wrote
Reply to comment by yubozhao in [D] How to get the fastest PyTorch inference and what is the "best" model serving framework? by big_dog_2k
Well, you disclosed who you are, but that's pretty much all you did :P
The OP asked a number of questions, and you didn't really answer any of them. You didn't explain what BentoML can offer, you didn't explain how it can speed up inference, you didn't really even explain what BentoML is.
Folks will tolerate "advertising" if it comes in the form of interesting technical content. However, you basically just mentioned your company and provided no technical content, so it's just pure negative value from most people's perspective.
yubozhao t1_iufs7wp wrote
Fair enough. I will probably get to it. I don’t know about you, but I need to “charge up” and make sure my answer is good. That takes time, and it was the Halloween weekend after all.