Submitted by GPUaccelerated t3_yf5jm3 in deeplearning
konze t1_iu37t3g wrote
I’m coming from academia with a lot of industry connections. Yes, there are a lot of companies that need fast DNN inference, to the point where they build custom ASICs just to meet their latency demands.
GPUaccelerated OP t1_iu4uxld wrote
That makes a lot of sense, and it's really cool. People resorting to ASICs for inference are definitely playing in the big boy leagues.
Thanks for sharing!