tech_ml_an_co t1_izt4bd0 wrote

The tech stack for serving an API is quite different. DL usually needs some kind of model server with a GPU, while traditional ML can run on Lambda or as a FastAPI app on an ordinary server.
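
Rough sketch of the traditional-ML side, a FastAPI app wrapping a scikit-learn model (the model file name and feature schema here are made up):

```python
import joblib
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.joblib")  # assumed: a scikit-learn estimator saved with joblib

class Features(BaseModel):
    values: list[float]  # a single flat feature vector

@app.post("/predict")
def predict(features: Features):
    # scikit-learn expects a 2D array: one row per sample
    prediction = model.predict([features.values])
    return {"prediction": prediction.tolist()}
```

Run it with `uvicorn main:app` and you have a CPU-only prediction service; for DL you'd typically put a dedicated model server (TorchServe, Triton, etc.) in front of a GPU instead.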

For batch processing the stacks are more similar; depending on your data size, you might not need a GPU even for deep learning.
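
For example, a trained network can be scored in chunks on CPU alone; here's a minimal sketch (the tiny model and random features are placeholders for a real trained model and dataset):

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Placeholders: swap in your trained model and real feature tensor.
model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 1))
model.eval()

features = torch.randn(10_000, 32)
loader = DataLoader(TensorDataset(features), batch_size=256)

predictions = []
with torch.no_grad():  # inference only, so skip gradient tracking
    for (batch,) in loader:
        predictions.append(model(batch))
predictions = torch.cat(predictions)  # shape (10_000, 1), computed entirely on CPU
```

Whether that's fast enough is purely a question of data volume and model size.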

Also, deep learning usually deals with unstructured data (images, text, audio), which requires different storage and training infrastructure.

You can read whole books about this topic, but at the core that's the difference, and it's why a lot of companies still don't use DL.


digital-bolkonsky OP t1_izt4z79 wrote

Right, so when it comes to compute: if I'm building a DL API for someone, how should I address the compute issue?
