yubozhao t1_iu9qzm3 wrote
Reply to comment by braintampon in [D] How to get the fastest PyTorch inference and what is the "best" model serving framework? by big_dog_2k
I guess others see this as spammy or ads? Honestly, I disclosed who I am and didn't try to sell anything (from my POV). I guess that's not welcome in this sub. shrug.jpg
Edit: typo
yubozhao t1_iu6huvq wrote
Reply to [D] How to get the fastest PyTorch inference and what is the "best" model serving framework? by big_dog_2k
Hello. I am the founder of BentoML. We are working on integrations with Triton and other high-performance serving runtimes.
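For anyone landing here from the original question, here is a minimal sketch of what serving a PyTorch model with BentoML looks like, assuming the BentoML 1.0 Python API. The model tag `demo_net` and service name `demo_service` are placeholders, and a model would need to be saved beforehand with `bentoml.pytorch.save_model("demo_net", model)`; this is not the Triton integration itself, just the basic serving path.

```python
import numpy as np
import torch
import bentoml
from bentoml.io import NumpyNdarray

# Load a previously saved PyTorch model as a runner.
# Assumes: bentoml.pytorch.save_model("demo_net", model) was run earlier.
runner = bentoml.pytorch.get("demo_net:latest").to_runner()

# A BentoML service wraps one or more runners behind an HTTP API.
svc = bentoml.Service("demo_service", runners=[runner])

@svc.api(input=NumpyNdarray(dtype="float32"), output=NumpyNdarray())
def predict(arr: np.ndarray) -> np.ndarray:
    # The runner executes the model in a separate worker process;
    # batching and hardware placement are configured on the runner.
    tensor = torch.from_numpy(arr)
    result = runner.run(tensor)
    return result.detach().numpy()
```

Served locally with something like `bentoml serve service.py:svc`, assuming the file is named `service.py`.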
yubozhao t1_iufs7wp wrote
Reply to comment by programmerChilli in [D] How to get the fastest PyTorch inference and what is the "best" model serving framework? by big_dog_2k
Fair enough. I will probably get to it. I don't know about you, but I need to "charge up" and make sure my answer is good. That takes time, and it was Halloween week, after all.